What’s wrong with this picture?

Posted By bilglas, Jun 24, 2006
Views: 296 | Replies: 7 | Status: Closed
This is WEIRD: I’m using CS2 … I’ve got a simple TIF image … it’s about 4×5 inches, RGB color mode, 72ppi, 8-bit. It IS flattened (background layer only) and it does NOT contain any alpha channels or paths. My goal is to save it as a JPG file. But it simply refuses … I get this error message: "Could not complete your request because of a program error."


Stewy
Jun 24, 2006
In article <Th8ng.664$>,
"bilglas" wrote:

This is WEIRD: I’m using CS2 … I’ve got a simple TIF image … it’s about 4

I’ve just tried saving a jpeg as a tiff and it gave choices for Mac or PC – this could be your problem.
Tacit
Jun 24, 2006
In article <Th8ng.664$>,
"bilglas" wrote:

This is WEIRD: I’m using CS2 … I’ve got a simple TIF image … it’s about 4

Is it a 16-bit-per-channel TIFF? JPEG images need to be 8-bit, not 16-bit. Go to Image->Mode and see what it says.
Tacit
Jun 24, 2006
In article ,
Stewy wrote:

I’ve just tried saving a jpeg as a tiff and it gave choices for Mac or PC – this could be your problem.

This won’t matter; any kind of computer can read either kind of TIFF. This difference is only important to a handful of applications that came out in the early 1990s that did not properly deal with the fact that Intel processors are little-endian.

The option should actually be "Intel and AMD/Everything Else" rather than "Mac/PC." It’s there because of a quirk in the way Intel processors handle data, a quirk that goes back to the 8008 processor (the 8080 kept its byte order for backward compatibility) and has remained in Intel chips ever since.

Let’s suppose that you have a sixteen-bit number. It’s made up of two eight-bit bytes, like so:

12 34

or

44 16

or

1E 2D (these are hexadecimal (base 16) numbers, not decimal (base 10) numbers.)

Now, you would think that, looking at those numbers, the bytes go in order from most significant to least significant, so that "12 34" would be the number 1234. But because of the quirk in the way Intel processors handle numbers, they are "little-endian": they flip the byte order of sixteen-bit numbers so that the little end (the least significant byte) comes first. The bytes 12 34 to an Intel processor are how you write the number 3412; the two bytes are swapped.

So to an Intel processor, the hex number 1492 is written as the bytes 92 14, and hex 1000 is written 00 10. And so on.
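To make that concrete, here is a minimal C sketch (just an illustration, not anything from Photoshop) that stores the hex value 1234 in a sixteen-bit variable and prints its two bytes in memory order. On an Intel (little-endian) machine it prints "34 12"; on a big-endian machine it prints "12 34".

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint16_t value = 0x1234;              /* a sixteen-bit number          */
    uint8_t *bytes = (uint8_t *)&value;   /* look at it one byte at a time */

    /* Little-endian (Intel): prints "34 12".  Big-endian: prints "12 34". */
    printf("%02X %02X\n", bytes[0], bytes[1]);
    return 0;
}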

The "Mac/PC" button in the TIFF save dialog tells the computer whether it should write sixteen-bit values in little-endian order for Intel processors, or in big-endian (normal) order for other computers. If you choose PC, every pair of eight-bit bytes is swapped. If you choose Mac, it is not.

Today, all decent TIFF reading programs on any computer can read a TIFF written either way, so it really doesn’t matter any more. I haven’t seen a program that cares which way a TIFF is written since about 1992 or so.
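For the curious, a TIFF file actually declares its byte order in its first two bytes: "II" means Intel (little-endian) order and "MM" means Motorola (big-endian) order, followed by the number 42 written in that byte order. Here is a rough C sketch (assuming nothing beyond the standard library, and certainly not Photoshop's actual code) of how a reader might check that header and decide whether sixteen-bit values need their bytes swapped:

#include <stdio.h>
#include <stdint.h>

int main(int argc, char *argv[])
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s file.tif\n", argv[0]);
        return 1;
    }

    FILE *f = fopen(argv[1], "rb");
    if (f == NULL) {
        perror("fopen");
        return 1;
    }

    uint8_t hdr[4];
    if (fread(hdr, 1, 4, f) != 4) {
        fclose(f);
        fprintf(stderr, "file too short\n");
        return 1;
    }
    fclose(f);

    int little_endian;
    if (hdr[0] == 'I' && hdr[1] == 'I')
        little_endian = 1;               /* "II": little-endian (Intel)    */
    else if (hdr[0] == 'M' && hdr[1] == 'M')
        little_endian = 0;               /* "MM": big-endian (Motorola)    */
    else {
        fprintf(stderr, "not a TIFF\n");
        return 1;
    }

    /* The magic number 42 is stored in whichever byte order was declared. */
    uint16_t magic = little_endian
        ? (uint16_t)(hdr[2] | (hdr[3] << 8))
        : (uint16_t)((hdr[2] << 8) | hdr[3]);

    printf("%s-endian TIFF, magic number = %u\n",
           little_endian ? "little" : "big", magic);
    return 0;
}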

As an aside, I’ve heard there is a C compiler out there whose compiler directive to use little-endian byte order is

#use goddamn stupid Intel byte order

which I think is pretty funny. 🙂


Art, photography, shareware, polyamory, literature, kink: all at http://www.xeromag.com/franklin.html
Nanohazard, Geek shirts, and more: http://www.villaintees.com
nomail
Jun 24, 2006
tacit wrote:

In article <Th8ng.664$>,
"bilglas" wrote:

This is WEIRD: I’m using CS2 … I’ve got a simple TIF image … it’s about 4×5 inches, RGB color mode, 72ppi, 8-bit. It IS flattened (background layer only) and it does NOT contain any alpha channels or paths. My goal is to save it as a JPG file. But it simply refuses … I get this error message: "Could not complete your request because of a program error."

Is it a 16-bit-per-channel TIFF? JPEG images need to be 8-bit, not 16-bit. Go to Image->Mode and see what it says.

16-bit images do not generate an error message. You simply do not see ‘JPEG’ as a choice in the ‘Save As’ dialog.


Johan W. Elzenga johan<<at>>johanfoto.nl Editor / Photographer http://www.johanfoto.nl
Tacit
Jun 24, 2006
In article <1hhg8ut.lmvql31pblwoeN%>,
(Johan W. Elzenga) wrote:

16-bit images do not generate an error message. You simply do not see ‘JPEG’ as a choice in the ‘Save As’ dialog.

I’ve seen some cases in CS where trying to use Save for Web on a 16-bit image generates a "Program Error" message.


Art, photography, shareware, polyamory, literature, kink: all at http://www.xeromag.com/franklin.html
Nanohazard, Geek shirts, and more: http://www.villaintees.com
nomail
Jun 24, 2006
tacit wrote:

In article <1hhg8ut.lmvql31pblwoeN%>,
(Johan W. Elzenga) wrote:

16-bit images do not generate an error message. You simply do not see ‘JPEG’ as a choice in the ‘Save As’ dialog.

I’ve seen some cases in CS where trying to use Save for Web on a 16-bit image generates a "Program Error" message.

Perhaps so, but the OP already said that he *can* save the file as BMP. That proves the file isn’t 16-bit, because BMP doesn’t support 16 bits per channel.


Johan W. Elzenga johan<<at>>johanfoto.nl Editor / Photographer http://www.johanfoto.nl
Dave Cohen
Jun 28, 2006
tacit wrote:
This won’t matter; any kind of computer can read either kind of TIFF. This difference is only important to a handful of applications that came out in the early 1990s etc. etc…

Hey Tacit, thanks for this. I’ve often wondered but never thought to check it out.
