Making a filter plugin: what to do with transparent pixels??

Posted By: Jace_tbl_003
Jun 30, 2005
Views: 323 · Replies: 2 · Status: Closed
Hi, I’m currently programming a filter plugin, and I’m kinda wondering what users would expect to happen with transparent pixels.

A problem arises when filtering layers that are not entirely filled (i.e. contain transparent parts).

Since pictures usually say more than a thousand words, please see what I mean here:

Before filtering:
http://img182.imageshack.us/img182/4119/layers11rq.png

After filtering:
http://img182.imageshack.us/img182/6780/layers28ho.png

For simplicity, let’s assume my filter just does a simple blur. Should I first "fix" the RGB data of the fully transparent pixels and then apply my filter normally? Because if I don’t, averaging could blur those white pixels into visible areas, even though the white in fully transparent pixels has no meaning at all.

Does anyone know how filters deal with this in general?


Nicholas Sherlock
Jun 30, 2005
Jace_tbl_003 wrote:
A problem arises when filtering layers that are not entirely filled (i.e. contain transparent parts).

For simplicity, let’s assume my filter just does a simple blur. Should I first "fix" the RGB data of the fully transparent pixels and then apply my filter normally? Because if I don’t, averaging could blur those white pixels into visible areas, even though the white in fully transparent pixels has no meaning at all.

Just use the opacity as a weight when summing values for your blending. That way, opacity-0 pixels won’t be considered at all when averaging nearby pixels, and opacity-255 pixels will work as normal.

Cheers,
Nicholas Sherlock
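A minimal sketch (my own, not from the thread) of the alpha-weighted averaging Nicholas describes, assuming a single grey channel and 8-bit values for simplicity:

```python
# Sketch of alpha-weighted box blur: each neighbour contributes in
# proportion to its alpha, so fully transparent pixels (alpha 0) are
# ignored entirely instead of bleeding their meaningless RGB into
# visible areas. Hypothetical helper, not Photoshop SDK code.

def alpha_weighted_blur(grey, alpha, width, height):
    """3x3 box-blur a flat, row-major list of grey values (0-255),
    weighting each neighbour by its opacity (0-255)."""
    out = list(grey)
    for y in range(height):
        for x in range(width):
            total, weight = 0.0, 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    nx, ny = x + dx, y + dy
                    if 0 <= nx < width and 0 <= ny < height:
                        i = ny * width + nx
                        total += grey[i] * alpha[i]
                        weight += alpha[i]
            if weight > 0:  # at least one visible neighbour
                out[y * width + x] = round(total / weight)
            # if every neighbour is transparent, leave the pixel alone
    return out

# A 3x1 strip: white, transparent white, black. The transparent pixel
# carries no weight, so no white bleeds toward the black pixel.
grey  = [255, 255, 0]
alpha = [255,   0, 255]
print(alpha_weighted_blur(grey, alpha, 3, 1))  # → [255, 128, 0]
```

With a plain unweighted average, the middle transparent white pixel would have pulled the black pixel toward grey; with the weights, it contributes nothing.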
Sean
Jul 12, 2005
On Fri, 01 Jul 2005 07:30:53 +1200, Nicholas Sherlock reverently intoned upon the aether:

Jace_tbl_003 wrote:
A problem arises when filtering layers that are not entirely filled (i.e. contain transparent parts).

For simplicity, let’s assume my filter just does a simple blur. Should I first "fix" the RGB data of the fully transparent pixels and then apply my filter normally? Because if I don’t, averaging could blur those white pixels into visible areas, even though the white in fully transparent pixels has no meaning at all.

Just use the opacity as a weight when adding values in for doing your blending. So opacity 0 pixels won’t be considered when adding nearby pixels, opacity 255 pixels will work as normal.

Cheers,
Nicholas Sherlock

This could create some odd boundary artifacts for some algorithms. Not that I have the answer here, and this is a reasonable start; just be aware it is not a complete solution and could cause problems of its own. For example, imagine an edge detector run on a pure white image reporting edges purely because of the zero-opacity pixels. And after all, if one wanted to find edges in a mask, would one not run the filter on the mask itself?

So computing on the raw RGB data might be better for an edge detector. But ignoring such pixels also has value: try Lens Blur with a mask and you will see that masked-away pixels do not affect the blur. Hence the proper solution is a function of the problem being solved.
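Sean's edge-detector caveat can be made concrete with a tiny 1-D sketch (my own illustration, not from the thread): a uniformly white strip whose right half is transparent. Folding alpha into the values (premultiplying) manufactures an edge at the opacity boundary; running on raw RGB does not.

```python
# Hypothetical 1-D illustration: an alpha-weighted scheme "sees" an
# edge wherever opacity changes, even though the colour is uniform.

def gradient(values):
    """Simple finite-difference edge response between neighbours."""
    return [values[i + 1] - values[i] for i in range(len(values) - 1)]

rgb   = [255, 255, 255, 255]   # pure white everywhere
alpha = [255, 255,   0,   0]   # right half fully transparent

# Premultiplying by alpha creates a phantom edge at the boundary:
premult = [r * a // 255 for r, a in zip(rgb, alpha)]
print(gradient(premult))   # nonzero response at the opacity boundary

# Computing on raw RGB, as Sean suggests for an edge detector, finds
# no edges in a uniformly white image:
print(gradient(rgb))
```

Which behaviour is "correct" depends entirely on whether the user thinks of transparency as part of the image content, which is Sean's point.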

hope this helps,

Sean

"In the End, we will remember not the words of our enemies, but the silence of our friends."

– Martin Luther King Jr. (1929-1968)

New Website
http://www.envisagement.com/
Last Updated 23 June 2005

