BREAKING NEWS: The end of JPEG is in sight

Posted by: un
Sep 30, 2005
Views: 4341
Replies: 118
Status: Closed
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock the DI world.

WHAT’S NEXT
Honey, I Shrunk the JPEG
By Shailaja Neelakantan, September 21, 2005
BUSINESS 2.0

If downloading digital photos stalls your PC, spare a thought for the data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.

Enter 25-year-old Arvind Thiagarajan, co-founder of Singapore-based startup MatrixView, who wants to revolutionize digital imaging. The data-compression algorithm he invented shrinks images into a format called a MatrixView Universal, or MVU, which is 15 to 300 percent smaller than a JPEG. But unlike a JPEG, which omits details, an MVU is as precise as the original. "Data loss is unacceptable in medical diagnosis," Thiagarajan says. That’s why the startup is focusing on health care first. A well-known hospital in Bangalore is using the technology, and MatrixView plans to ink deals in the coming year with several Fortune 100 health-care companies in the United States. MatrixView is also targeting other subsets of the $9 billion U.S. digital-imaging market. Right now it’s negotiating with chipmakers to embed the technology in cameras and fit more files on storage cards. MRIs today, vacation snaps tomorrow.

http://www.business2.com/b2/web/articles/0,17863,1106847,00.html

http://www.matrixview.com/

Download white paper:
http://matrixview.com/files/ABO%20white%20paper.pdf


MI
Matt Ion
Sep 30, 2005
Unless it’s freely available to all developers, and doesn’t include some cockeyed protection scheme that makes it difficult for one to backup/edit/copy one’s own pictures, it’ll never fly for the mass market.

+/- wrote:

Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock the DI world.
WHAT’S NEXT
Honey, I Shrunk the JPEG
By Shailaja Neelakantan, September 21, 2005
BUSINESS 2.0

If downloading digital photos stalls your PC, spare a thought for the data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.

Enter 25-year-old Arvind Thiagarajan, co-founder of Singapore-based startup MatrixView, who wants to revolutionize digital imaging. The data-compression algorithm he invented shrinks images into a format called a MatrixView Universal, or MVU, which is 15 to 300 percent smaller than a JPEG. But unlike a JPEG, which omits details, an MVU is as precise as the original. "Data loss is unacceptable in medical diagnosis," Thiagarajan says. That’s why the startup is focusing on health care first. A well-known hospital in Bangalore is using the technology, and MatrixView plans to ink deals in the coming year with several Fortune 100 health-care companies in the United States. MatrixView is also targeting other subsets of the $9 billion U.S. digital-imaging market. Right now it’s negotiating with chipmakers to embed the technology in cameras and fit more files on storage cards. MRIs today, vacation snaps tomorrow.

http://www.business2.com/b2/web/articles/0,17863,1106847,00.html
http://www.matrixview.com/

Download white paper:
http://matrixview.com/files/ABO%20white%20paper.pdf


RB
Randy Berbaum
Sep 30, 2005
In rec.photo.digital Matt Ion wrote:
: Unless it’s freely available to all developers, and doesn’t include some
: cockeyed protection scheme that makes it difficult for one to
: backup/edit/copy one’s own pictures, it’ll never fly for the mass market.

Also, nothing is said about processing time. Given a sufficiently complex program, it is quite possible to compress any photo much further than any format in current use. Compression of an X-ray or MRI is "fast" if the result shows up within a few minutes, while the doctor and patient are walking back to the doctor’s office. But how many of us are going to be happy waiting 2 minutes between shots on our digital cameras, just to save 1/3 to 1/2 the memory space? Personally, I would rather purchase more memory than wait several minutes (or even tens of seconds) between normal shots. JMHO

Now, if such a program were developed for archiving photos in more compact but lossless forms, it could have a big impact.

Randy

==========
Randy Berbaum
Champaign, IL
C
casioculture
Sep 30, 2005
Matt Ion wrote:
Unless it’s freely available to all developers, and doesn’t include some cockeyed protection scheme that makes it difficult for one to backup/edit/copy one’s own pictures, it’ll never fly for the mass market.

Indeed, JPEG is an industry-wide standard from a joint ISO/IEC and ITU-T committee.

AMD is now making 5 GHz processors. Broadband is now being offered at 24 Mbps. So hardware bandwidth is no issue. The last thing the industry will do is entrust its data formats to a proprietary one from an obscure upstart.
T
Trevor
Sep 30, 2005
<big snip>

Download white paper:
http://matrixview.com/files/ABO%20white%20paper.pdf

Data formats are like standards – there are so many to choose from. I would rather put my trust in JPEG2000, but that’s taking its time getting to the masses – anyone up to date on the Lizardtech claims?
P
philip
Sep 30, 2005
In article wrote:
If downloading digital photos stalls your PC, spare a thought for the data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.

10 megabytes only takes about 1 second on 100 Mbps ethernet. Is that a big deal?

Anyhow, a 10 megabyte jpeg is probably more than 50 Mpixels. I don’t know what kind of viewing devices they have in hospitals, but starting out with lower resolution images and then getting high res crops from the real image (cropping at block boundaries is cheap in jpeg) strikes me as a good solution to reduce bandwidth.

Enter 25-year-old Arvind Thiagarajan, co-founder of Singapore-based startup MatrixView, who wants to revolutionize digital imaging. The data-compression algorithm he invented shrinks images into a format called a MatrixView Universal, or MVU, which is 15 to 300 percent smaller than a JPEG. But unlike a JPEG, which omits details, an MVU is as precise as the original. "Data loss is unacceptable in medical diagnosis," Thiagarajan says. That’s why the startup is focusing on health care first.

The usual snake-oil. Lossless compression doesn’t work all that well on images that contain noise. Any algorithm that deletes noise is also going to delete some image detail (unless the algorithm has so much domain-specific knowledge that you can save just the ‘contents’ of the image and not the pixels.)


That was it. Done. The faulty Monk was turned out into the desert where it could believe what it liked, including the idea that it had been hard done by. It was allowed to keep its horse, since horses were so cheap to make. — Douglas Adams in Dirk Gently’s Holistic Detective Agency
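
A quick sanity check of the transfer-time figures in this thread, as a minimal Python sketch (idealized: raw bit rate only, ignoring protocol overhead and contention):

def transfer_seconds(size_bytes, link_bits_per_sec):
    # Ideal transfer time: size in bytes over a link rated in bits/second.
    return size_bytes * 8 / link_bits_per_sec

scan = 10 * 1024 * 1024  # the article's 10-megabyte scan

for label, rate in [("100 Mbps Ethernet", 100e6),
                    ("Gigabit Ethernet", 1e9),
                    ("3 Mbps home broadband", 3e6)]:
    print("%s: %.2f s" % (label, transfer_seconds(scan, rate)))

# 100 Mbps Ethernet: 0.84 s
# Gigabit Ethernet: 0.08 s
# 3 Mbps home broadband: 27.96 s
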
RH
Ron Hunter
Sep 30, 2005
+/- wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock the DI world.
WHAT’S NEXT
Honey, I Shrunk the JPEG
By Shailaja Neelakantan, September 21, 2005
BUSINESS 2.0

If downloading digital photos stalls your PC, spare a thought for the data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.

Enter 25-year-old Arvind Thiagarajan, co-founder of Singapore-based startup MatrixView, who wants to revolutionize digital imaging. The data-compression algorithm he invented shrinks images into a format called a MatrixView Universal, or MVU, which is 15 to 300 percent smaller than a JPEG. But unlike a JPEG, which omits details, an MVU is as precise as the original. "Data loss is unacceptable in medical diagnosis," Thiagarajan says. That’s why the startup is focusing on health care first. A well-known hospital in Bangalore is using the technology, and MatrixView plans to ink deals in the coming year with several Fortune 100 health-care companies in the United States. MatrixView is also targeting other subsets of the $9 billion U.S. digital-imaging market. Right now it’s negotiating with chipmakers to embed the technology in cameras and fit more files on storage cards. MRIs today, vacation snaps tomorrow.

http://www.business2.com/b2/web/articles/0,17863,1106847,00.html
http://www.matrixview.com/

Download white paper:
http://matrixview.com/files/ABO%20white%20paper.pdf
Well, let me know when IrfanView has it and Photoshop adopts it; THEN I will be impressed.
BTW, NO MENTION was made of color!


Ron Hunter
NS
Nicholas Sherlock
Sep 30, 2005
+/- wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock the DI world.

My bullshit senses are tingling.

Cheers,
Nicholas Sherlock
RH
Ron Hunter
Sep 30, 2005
Randy Berbaum wrote:
In rec.photo.digital Matt Ion wrote:
: Unless it’s freely available to all developers, and doesn’t include some
: cockeyed protection scheme that makes it difficult for one to
: backup/edit/copy one’s own pictures, it’ll never fly for the mass market.
Also, nothing is said about processing time. Given a sufficiently complex program, it is quite possible to compress any photo much further than any format in current use. Compression of an X-ray or MRI is "fast" if the result shows up within a few minutes, while the doctor and patient are walking back to the doctor’s office. But how many of us are going to be happy waiting 2 minutes between shots on our digital cameras, just to save 1/3 to 1/2 the memory space? Personally, I would rather purchase more memory than wait several minutes (or even tens of seconds) between normal shots. JMHO

Now, if such a program were developed for archiving photos in more compact but lossless forms, it could have a big impact.

Randy

==========
Randy Berbaum
Champaign, IL
Moreover, the math in the quote leaves much to be desired. A 10 megabyte image across a gigabit Ethernet connection takes less than 1 second to transmit. 60 of those an hour is hardly a significant network load.
Then there is the aspect that MRIs are NOT COLOR. I am sure that going back to B&W is not an option for most of us.


Ron Hunter
K
Kingdom
Sep 30, 2005
"+/-" wrote in news::

Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock the DI world.

WHAT’S NEXT
Honey, I Shrunk the JPEG
By Shailaja Neelakantan, September 21, 2005
BUSINESS 2.0

If downloading digital photos stalls your PC, spare a thought for the
data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.
Enter 25-year-old Arvind Thiagarajan, co-founder of
Singapore-based
startup MatrixView, who wants to revolutionize digital imaging. The data-compression algorithm he invented shrinks images into a format called a MatrixView Universal, or MVU, which is 15 to 300 percent smaller than a JPEG. But unlike a JPEG, which omits details, an MVU is as precise as the original. "Data loss is unacceptable in medical diagnosis," Thiagarajan says. That’s why the startup is focusing on health care first. A well-known hospital in Bangalore is using the technology, and MatrixView plans to ink deals in the coming year with several Fortune 100 health-care companies in the United States. MatrixView is also targeting other subsets of the $9 billion U.S. digital-imaging market. Right now it’s negotiating with chipmakers to embed the technology in cameras and fit more files on storage cards. MRIs today, vacation snaps tomorrow.

http://www.business2.com/b2/web/articles/0,17863,1106847,00.html
http://www.matrixview.com/

Download white paper:
http://matrixview.com/files/ABO%20white%20paper.pdf

Doubt we’ll ever even see this format, never mind use it. If they want cash from hospitals they really are greedy bastards, and it’s about 2 years too late; we now have high-speed everything!


f=Ma well, nearly…
B
birp2211
Sep 30, 2005
"Randy Berbaum" wrote in message
In rec.photo.digital Matt Ion wrote:
: Unless it’s freely available to all developers, and doesn’t include some
: cockeyed protection scheme that makes it difficult for one to
: backup/edit/copy one’s own pictures, it’ll never fly for the mass market.
Also, nothing is said about processing time. Given a sufficiently complex program, it is quite possible to compress any photo much further than any format in current use.

ABO’s feature (according to the page) is speed. It involves nothing but integer manipulations.

Compression of an X-ray or MRI is "fast" if the result shows up within a few minutes, while the doctor and patient are walking back to the doctor’s office. But how many of us are going to be happy waiting 2 minutes between shots on our digital cameras, just to save 1/3 to 1/2 the memory space? Personally, I would rather purchase more memory than wait several minutes (or even tens of seconds) between normal shots. JMHO

Now, if such a program were developed for archiving photos in more compact but lossless forms, it could have a big impact.

Well, FWIW, that’s what ABO seems to offer.

Randy

==========
Randy Berbaum
Champaign, IL
B
birp2211
Sep 30, 2005
"Nicholas Sherlock" wrote in message
+/- wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock
the DI world.

My bullshit senses are tingling.

Why?

Perhaps the forecast of doom is premature, but better algorithms for compression aren’t technically impossible.

Not like MP3 hasn’t been bested from numerous angles.

Perhaps you simply like to express negativity to new ideas /just because/?
BC
Bruce Coryell
Sep 30, 2005
Matt Ion wrote:
Unless it’s freely available to all developers, and doesn’t include some cockeyed protection scheme that makes it difficult for one to backup/edit/copy one’s own pictures, it’ll never fly for the mass market.
+/- wrote:

Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock the DI world.
WHAT’S NEXT
Honey, I Shrunk the JPEG
By Shailaja Neelakantan, September 21, 2005
BUSINESS 2.0

If downloading digital photos stalls your PC, spare a thought for the
data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.

Enter 25-year-old Arvind Thiagarajan, co-founder of Singapore-based startup MatrixView, who wants to revolutionize digital imaging. The data-compression algorithm he invented shrinks images into a format called a
MatrixView Universal, or MVU, which is 15 to 300 percent smaller than a JPEG. But unlike a JPEG, which omits details, an MVU is as precise as the original. "Data loss is unacceptable in medical diagnosis," Thiagarajan says. That’s why the startup is focusing on health care first. A well-known
hospital in Bangalore is using the technology, and MatrixView plans to ink
deals in the coming year with several Fortune 100 health-care companies in
the United States. MatrixView is also targeting other subsets of the $9 billion U.S. digital-imaging market. Right now it’s negotiating with chipmakers to embed the technology in cameras and fit more files on storage
cards. MRIs today, vacation snaps tomorrow.

http://www.business2.com/b2/web/articles/0,17863,1106847,00.html
http://www.matrixview.com/

Download white paper:
http://matrixview.com/files/ABO%20white%20paper.pdf



Remember how Foveon was supposed to revolutionize digital photography and what a zero it turned out to be… All of us poor sods using conventional CCD’s were supposed to have hopelessly obsolete equipment by now.
CB
Chris Brown
Sep 30, 2005
In article <ZI8%e.161$>,
Bruce Coryell wrote:
Remember how Foveon was supposed to revolutionize digital photography and what a zero it turned out to be… All of us poor sods using conventional CCD’s were supposed to have hopelessly obsolete equipment by now.

Sssh! You’ll wake *him* up…
CS
Charlie Self
Sep 30, 2005
Nicholas Sherlock wrote:
+/- wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock the DI world.

My bullshit senses are tingling.

Yup. Let’s stick it in the catapult and see if it flies when it reaches the end of the shot.
S
Stewy
Sep 30, 2005
In article ,
(Philip Homburg) wrote:

In article wrote:
If downloading digital photos stalls your PC, spare a thought for the data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.

10 megabytes only takes about 1 second on 100 Mbps ethernet. Is that a big deal?

Depends. Exactly how fast is your broadband? I’m on a LAN rated at 10 Mbps, but do I ever get that? Well, if everyone else on the LAN logged off, then maybe yes. As it is, I’m lucky to get 100 kbps for either music downloads or binaries.
So data compression without loss (I’m assuming these are B&W/false color, i.e. 32 or 256 colors) IS very useful. How it deals with full color JPEGs is another matter.
Anyhow, a 10 megabyte jpeg is probably more than 50 Mpixels. I don’t know what kind of viewing devices they have in hospitals, but starting out with lower resolution images and then getting high res crops from the real image (cropping at block boundaries is cheap in jpeg) strikes me as a good solution to reduce bandwidth.

Enter 25-year-old Arvind Thiagarajan, co-founder of Singapore-based startup MatrixView, who wants to revolutionize digital imaging. The data-compression algorithm he invented shrinks images into a format called a MatrixView Universal, or MVU, which is 15 to 300 percent smaller than a JPEG. But unlike a JPEG, which omits details, an MVU is as precise as the original. "Data loss is unacceptable in medical diagnosis," Thiagarajan says. That’s why the startup is focusing on health care first.

The usual snake-oil. Lossless compression doesn’t work all that well on images that contain noise. Any algorithm that deletes noise is also going to delete some image detail (unless the algorithm has so much domain-specific knowledge that you can save just the ‘contents’ of the image and not the pixels.)
V
veldy71
Sep 30, 2005
In rec.photo.digital Randy Berbaum wrote:
Now, if such a program were developed for archiving photos in more compact but lossless forms, it could have a big impact.

Adobe DNG does a pretty good job with RAW file storage. It shrinks my NEF files from my D70 by about 25% [rough estimate].


Thomas T. Veldhouse
Key Fingerprint: 2DB9 813F F510 82C2 E1AE 34D0 D69D 1EDC D5EC AED1 Spammers please contact me at
MR
Mark Roberts
Sep 30, 2005
"Charlie Self" wrote:

Nicholas Sherlock wrote:
+/- wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock the DI world.

My bullshit senses are tingling.

Yup. Let’s stick it in the catapult and see if it flies when it reaches the end of the shot.

The "15 to 300 percent smaller than a JPEG" certainly sets off alarm bells. 15 to 300 percent smaller than what JPEG quality level? And with what *kind* of image (in terms of content): This has an influence an how effective JPEG compression is.

Perhaps, a "naive, uninformed reporter" detector or "overhyping press release" detector might be a better term than "bullshit senses", but suspicion is certainly merited.


Mark Roberts
Photography and writing
www.robertstech.com
N
none
Sep 30, 2005
*– Jinn –* wrote:
Perhaps the forecast of doom is premature, but better algorithms for compression aren’t technically impossible.

Yes, they are. Huffman coding
(http://en.wikipedia.org/wiki/Huffman_coding) provably gives the most efficient result for lossless compression. This is the algorithm that is used for TIFF.

If you managed to find a general method for losslessly compressing bitmapped images to 30% the size of a JPEG, you’d get more attention than a 150-word press release on some no-name website.

-Mike
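
For reference, a minimal Huffman coder in Python (a sketch of the algorithm named above; note the caveat raised downthread that Huffman is only optimal among codes that map each input symbol to a fixed bit pattern):

import heapq
from collections import Counter

def huffman_codes(data):
    # Build a {symbol: bitstring} prefix-code table for the bytes in data.
    freq = Counter(data)
    # Heap entries are (frequency, tiebreaker, tree); a tree is either a
    # leaf symbol or a (left, right) pair of subtrees.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    if tiebreak == 1:                      # degenerate one-symbol input
        return {heap[0][2]: "0"}
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)  # pop the two rarest trees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tiebreak, (left, right)))
        tiebreak += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):        # internal node: descend
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                              # leaf: record its code
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes

data = b"this is an example of huffman coding"
table = huffman_codes(data)
coded = sum(len(table[b]) for b in data)
print("%d bits raw -> %d bits coded" % (len(data) * 8, coded))
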
M
mark
Sep 30, 2005
Bruce Coryell wrote:
Matt Ion wrote:
Unless it’s freely available to all developers, and doesn’t include some cockeyed protection scheme that makes it difficult for one to backup/edit/copy one’s own pictures, it’ll never fly for the mass market.
+/- wrote:

Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock the DI world.
WHAT’S NEXT
Honey, I Shrunk the JPEG
By Shailaja Neelakantan, September 21, 2005
BUSINESS 2.0

If downloading digital photos stalls your PC, spare a thought for the
data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.

Enter 25-year-old Arvind Thiagarajan, co-founder of Singapore-based startup MatrixView, who wants to revolutionize digital imaging. The data-compression algorithm he invented shrinks images into a format called a
MatrixView Universal, or MVU, which is 15 to 300 percent smaller than a JPEG. But unlike a JPEG, which omits details, an MVU is as precise as the original. "Data loss is unacceptable in medical diagnosis," Thiagarajan says. That’s why the startup is focusing on health care first. A well-known
hospital in Bangalore is using the technology, and MatrixView plans to ink
deals in the coming year with several Fortune 100 health-care companies in
the United States. MatrixView is also targeting other subsets of the $9 billion U.S. digital-imaging market. Right now it’s negotiating with chipmakers to embed the technology in cameras and fit more files on storage
cards. MRIs today, vacation snaps tomorrow.

http://www.business2.com/b2/web/articles/0,17863,1106847,00.html
http://www.matrixview.com/

Download white paper:
http://matrixview.com/files/ABO%20white%20paper.pdf



Remember how Foveon was supposed to revolutionize digital photography and what a zero it turned out to be… All of us poor sods using conventional CCD’s were supposed to have hopelessly obsolete equipment by now.

did it?
R
Roberto
Sep 30, 2005
If you managed to find a general method for losslessly compressing bitmapped images to 30% the size of a JPEG, you’d get more attention than a 150-word press release on some no-name website.

And further, methods can be patented (if they meet the prerequisites of not being public earlier, etc. etc.) so look to the patent office. Regardless, patents do not require that the method be proven to be better, just unique. I can check that out from work later.

The so-called white paper is topical, not adequate to tell what the author is really doing. References to symbolic representations look like nothing but adaptive compression schemes in the same file. Nothing new there in the research community.

I look forward to authoritative reviews in the journals.

Now it’s time to go to the day job on T2, 100mb desktop machines, fiber optic backbones and one fast courier who can carry a few terabytes of images in his arms up the elevator faster than God.
V
veldy71
Sep 30, 2005
In rec.photo.digital none wrote:
If you managed to find a general method for losslessly compressing bitmapped images to 30% the size of a JPEG, you’d get more attention than a 150-word press release on some no-name website.

Absolutely correct!


Thomas T. Veldhouse
Key Fingerprint: 2DB9 813F F510 82C2 E1AE 34D0 D69D 1EDC D5EC AED1 Spammers please contact me at
P
philip
Sep 30, 2005
In article ,
Stewy wrote:
In article ,
(Philip Homburg) wrote:

In article wrote:
If downloading digital photos stalls your PC, spare a thought for the data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.

10 megabytes only takes about 1 second on 100 Mbps ethernet. Is that a big deal?

Depends. Exactly how fast is your broadband?

At work, I can get 100 Mbps when I need it. The backbone tends to be fast enough.

As it is, I’m lucky to get 100 kbps for either music downloads or binaries.

That means that you have a completely obsolete backbone.

(At home, I have about 3Mbps down (and the ISP’s network has enough capacity), so a single 10 MByte image takes about 30 seconds.)

So data compression without loss (I’m assuming these are B&W/false color – ie 32 or 256 colors) IS very useful. How it deals with full color JPEGs is another matter.

There is not going to be any lossless compression that works better than JPEG (without using domain-specific knowledge). If you have images that are suitable for lossless compression, compress them with PNG. For grayscale images, compressing a TIFF with bzip2 may also work.


That was it. Done. The faulty Monk was turned out into the desert where it could believe what it liked, including the idea that it had been hard done by. It was allowed to keep its horse, since horses were so cheap to make. — Douglas Adams in Dirk Gently’s Holistic Detective Agency
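
Philip’s PNG/bzip2 suggestion is easy to try. A minimal sketch using Pillow and the standard bz2 module (the library choices are mine, and "scan.tiff" is a hypothetical input file):

import bz2
from PIL import Image

# PNG: lossless, deflate-based compression via Pillow's save().
Image.open("scan.tiff").save("scan.png", optimize=True)

# bzip2 over the raw TIFF bytes, as suggested for grayscale images.
with open("scan.tiff", "rb") as f:
    raw = f.read()
with open("scan.tiff.bz2", "wb") as f:
    f.write(bz2.compress(raw, compresslevel=9))
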
R
Roberto
Sep 30, 2005
Will you people who reply with lame one-liners please SNIP THE ARTICLE? (mark)?
SP
Stephen Poley
Sep 30, 2005
On Fri, 30 Sep 2005 08:10:03 -0500, "Lorem Ipsum" wrote:

If you managed to find a general method for losslessly compressing bitmapped images to 30% the size of a JPEG, you’d get more attention than a 150-word press release on some no-name website.

And further, methods can be patented (if they meet the prerequisites of not being public earlier, etc. etc.) so look to the patent office. Regardless, patents do not require that the method be proven to be better, just unique. I can check that out from work later.

The so-called white paper is topical, not adequate to tell what the author is really doing. References to symbolic representations look like nothing but adaptive compression schemes in the same file.

That’s being kind. It looks like snake-oil to me. For example the references to the OSI model say to me that the author neither understands the OSI model nor wants to.

There was a similar case of somebody making ludicrous compression claims in the Netherlands a year or so back. I don’t think anyone managed to discover whether the author was a con-man or merely deluded, but he certainly didn’t have anything workable. (He is now dead, so I guess we’ll never know.) It seems to be this decade’s perpetual motion machine.


Stephen Poley
MI
Matt Ion
Sep 30, 2005
Trevor wrote:

<big snip>

Download white paper:
http://matrixview.com/files/ABO%20white%20paper.pdf

Data formats are like standards – there are so many to choose from. I would rather put my trust in JPEG2000, but that’s taking its time getting to the masses – anyone up to date on the Lizardtech claims?

That’s a perfect example: even the wonderful free IrfanView has only very limited support for JPEG2000 because the plugin must be paid for. 99.9% of users have no need for the format’s extra features/capabilities that would make paying for that support worthwhile, especially when regular JPG is more than sufficient.


T
toby
Sep 30, 2005
*– Jinn –* wrote:
"Nicholas Sherlock" wrote in message
+/- wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock
the DI world.

My bullshit senses are tingling.

Why?

Press releases that begin in that vein tend to end in this one: http://www.smh.com.au/news/Business/Inventor-faces-his-angry-creditors/2004/12/14/1102787085195.html

Perhaps the forecast of doom is premature, but better algorithms for compression aren’t technically impossible.

Not like MP3 hasn’t been bested from numerous angles.

Perhaps you simply like to express negativity to new ideas /just because/?
T
toby
Sep 30, 2005
toby wrote:
*– Jinn –* wrote:
"Nicholas Sherlock" wrote in message
+/- wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock
the DI world.

My bullshit senses are tingling.

Why?

Press releases that begin in that vein tend to end in this one: http://www.smh.com.au/news/Business/Inventor-faces-his-angry-creditors/2004/12/14/1102787085195.html

Better article @
http://www.theage.com.au/articles/2004/09/20/1095651251085.html?from=storylhs
G
gthart
Sep 30, 2005
"Mark Roberts" wrote in message
The "15 to 300 percent smaller than a JPEG" certainly sets off alarm bells.

BTW how can anything be more than 100% smaller?
100% smaller = 0

Gerrit
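
The arithmetic behind Gerrit’s objection, sketched in Python; the charitable reading is that the press release means a JPEG is up to 300 percent larger than an MVU, i.e. the MVU is as little as a quarter of the JPEG’s size:

def percent_smaller(original, compressed):
    # "N percent smaller" = the fraction of the original that went away.
    return 100.0 * (original - compressed) / original

print(percent_smaller(1000, 850))  # 15.0  -> "15 percent smaller"
print(percent_smaller(1000, 250))  # 75.0  -> a quarter of the size
print(percent_smaller(1000, 0))    # 100.0 -> nothing left; the ceiling
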
BP
Barry Pearson
Sep 30, 2005
Thomas T. Veldhouse wrote:
In rec.photo.digital Randy Berbaum wrote:
Now, if such a program were developed for archiving photos in more compact but lossless forms, it could have a big impact.

Adobe DNG does a pretty good job with RAW file storage. It shrinks my NEF files from my D70 by about 25% [rough estimate].

DNG uses lossless JPEG compression, which suggests it can’t shrink files nearly as much as lossy JPEG does. So it is unlikely to be able to compete with this new format, assuming the statement about it is accurate. But I am sceptical about whether this is really lossless compression.


Barry Pearson
http://www.barry.pearson.name/photography/
http://www.birdsandanimals.info/
RK
Richard Kettlewell
Sep 30, 2005
none writes:
*– Jinn –* wrote:

Perhaps the forecast of doom is premature, but better algorithms for compression aren’t technically impossible.

Yes, they are. Huffman coding
(http://en.wikipedia.org/wiki/Huffman_coding) provably gives the most efficient result for lossless compression. This is the algorithm that is used for TIFF.

Err, sort of; it gives the most efficient result if you are constrained to map each input symbol to the same output bit pattern. That’s hardly a universal constraint.

If you managed to find a general method for losslessly compressing bitmapped images to 30% the size of a JPEG, you’d get more attention than a 150-word press release on some no-name website.

This, however, is true.


http://www.greenend.org.uk/rjk/
RW
Roger Whitehead
Sep 30, 2005
Which means you get a negative image. Quite common in photography. 😎



Roger
D
davem
Sep 30, 2005
Stewy writes:

10 megabytes only takes about 1 second on 100 Mbps ethernet. Is that a big deal?

Depends. Exactly how fast is your broadband? I’m on a LAN rated at 10 Mbps, but do I ever get that? Well, if everyone else on the LAN logged off, then maybe yes. As it is, I’m lucky to get 100 kbps for either music downloads or binaries.

The port coming out of your cable modem is 10 Mbps, but the cable’s maximum useful bandwidth (and the modem’s maximum capability) is a fraction of that. And you do have to share it with your neighbours because there’s only one wire.

Any hospital installing bargain-basement equipment today would get at least 100 Mbps hardware and switches not hubs. That can actually sustain at least 50 Mbps of data transfer, and many transfers can be in progress at the same time because of the switches as long as they use different paths. 10 MB images are not a problem.

Dave
PT
Peter Twydell
Sep 30, 2005
In message <433d58d5$0$6560$>, Gerrit ’t Hart writes
"Mark Roberts" wrote in message
The "15 to 300 percent smaller than a JPEG" certainly sets off alarm bells.

BTW how can anything be more than 100% smaller?
100% smaller = 0

Gerrit
You beat me to it. This is sheer nonsense, along the same lines as journalists writing "three times smaller" when they mean (I think) "one-third as big", or "300% bigger" when there’s only a 200% increase. And another thing… (rant, mutter, mumble)

Peter

Ying tong iddle-i po!
KW
Ken Weitzel
Sep 30, 2005
Peter Twydell wrote:
In message <433d58d5$0$6560$>, Gerrit ’t Hart writes

"Mark Roberts" wrote in message

The "15 to 300 percent smaller than a JPEG" certainly sets off alarm bells.

BTW how can anything be more than 100% smaller?
100% smaller = 0

Gerrit
You beat me to it. This is sheer nonsense, along the same lines as journalists writing "three times smaller" when they mean (I think) "one-third as big", or "300% bigger" when there’s only a 200% increase. And another thing… (rant, mutter, mumble)

I’m giving 110% effort here, but still confused 🙂

Ken
P
PcB
Sep 30, 2005
<<The data-compression algorithm he invented shrinks images into a format called a MatrixView Universal, or MVU, which is 15 to 300 percent smaller than a JPEG.

Er, how can you make something 300% smaller? 100% is all of it ….


Paul ============}
o o

// Live fast, die old //
PaulsPages are at http://homepage.ntlworld.com/pcbradley/
T
toby
Sep 30, 2005
none wrote:
*– Jinn –* wrote:
Perhaps the forecast of doom is premature, but better algorithms for compression aren’t technically impossible.

Yes, they are. Huffman coding
(http://en.wikipedia.org/wiki/Huffman_coding) provably gives the most efficient result for lossless compression. This is the algorithm that is used for TIFF.

Huffman is not particularly effective except for bilevel (1-bit) images. The LZW family of algorithms in particular performs better in general. TIFF uses LZW, ZIP and varieties of RLE (such as Apple PackBits), in addition to the CCITT Huffman-based methods defined for faxes.

References:
TIFF standard: http://www.digitalpreservation.gov/formats/fdd/fdd000022.shtml
LZW Explained: http://www.danbbs.dk/~dino/whirlgif/lzw.html
Intro to Data Compression: http://www.faqs.org/faqs/compression-faq/part2/section-1.html

If you managed to find a general method for losslessly compressing bitmapped images to 30% the size of a JPEG, you’d get more attention than a 150-word press release on some no-name website.

-Mike
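
A quick way to compare general-purpose lossless coders on real image data, in the spirit of toby’s point that the LZ family usually beats plain Huffman (zlib, bz2 and lzma stand in for TIFF’s actual codecs here, and "scan.tiff" is a hypothetical test file):

import bz2, lzma, zlib

with open("scan.tiff", "rb") as f:
    raw = f.read()

for name, packed in [("zlib (LZ77 + Huffman)", zlib.compress(raw, 9)),
                     ("bz2 (Burrows-Wheeler)", bz2.compress(raw)),
                     ("lzma (LZ family)", lzma.compress(raw))]:
    print("%s: %.1f%% of original" % (name, 100.0 * len(packed) / len(raw)))
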
E
eawckyegcy
Sep 30, 2005
*– Jinn –* wrote:

My bullshit senses are tingling.

Why?

Because there are a large number of "data compression" claims that have later been shown to be bullshit (or, usually, a failure on the part of the claimant to back up his claim). They are the perpetual motion machines of computation.

Perhaps you simply like to express negativity to new ideas /just because/?

More likely is that you are just ignorant of the history of these things.
G
Gormless
Sep 30, 2005
"+/-" wrote in message
data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.

If 10 megabytes a minute can cripple a hospital network then I don’t think much of their networks.
And since when was a 10 MB image ‘enormous’?
L
Larry
Oct 1, 2005
On Fri, 30 Sep 2005 00:14:55 -0400, "+/-" wrote:

Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock the DI world.
WHAT’S NEXT
Honey, I Shrunk the JPEG
By Shailaja Neelakantan, September 21, 2005
BUSINESS 2.0

If downloading digital photos stalls your PC, spare a thought for the data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.

Enter 25-year-old Arvind Thiagarajan, co-founder of Singapore-based startup MatrixView, who wants to revolutionize digital imaging. The data-compression algorithm he invented shrinks images into a format called a MatrixView Universal, or MVU, which is 15 to 300 percent smaller than a JPEG. But unlike a JPEG, which omits details, an MVU is as precise as the original.

B.S.
PJ
Paul J Gans
Oct 1, 2005
In rec.photo.digital Randy Berbaum wrote:
In rec.photo.digital Matt Ion wrote:
: Unless it’s freely available to all developers, and doesn’t include some
: cockeyed protection scheme that makes it difficult for one to
: backup/edit/copy one’s own pictures, it’ll never fly for the mass market.

Also, nothing is said about processing time. Given a sufficiently complex program, it is quite possible to compress any photo much further than any format in current use. Compression of an X-ray or MRI is "fast" if the result shows up within a few minutes, while the doctor and patient are walking back to the doctor’s office. But how many of us are going to be happy waiting 2 minutes between shots on our digital cameras, just to save 1/3 to 1/2 the memory space? Personally, I would rather purchase more memory than wait several minutes (or even tens of seconds) between normal shots. JMHO

Now, if such a program were developed for archiving photos in more compact but lossless forms, it could have a big impact.

I agree. But that is a serious major use. With an
ever larger pixel count it is getting to the point
where I will be up to my navel in DVDs with images
on them.

—– Paul J. Gans
R
rlhaar
Oct 1, 2005
On 2005/9/30 6:17 AM, "*– Jinn –*" wrote:

"Nicholas Sherlock" wrote in message
+/- wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock
the DI world.

My bullshit senses are tingling.

Why?

For one thing, the original post cannot be mathematically correct. A 300 percent reduction in size would make the file size negative.
T
toby
Oct 1, 2005
Paul J Gans wrote:


Now, if such a program were developed for archiving photos in more compact but lossless forms, it could have a big impact.

I agree. But that is a serious major use. With an
ever larger pixel count it is getting to the point
where I will be up to my navel in DVDs with images
on them.

That’s odd – because my folders of 35mm/120 negatives, which carry considerably more information than any JPEG I’ve seen – take up relatively little space.

So much for the revolution.

—– Paul J. Gans
NS
Nicholas Sherlock
Oct 1, 2005
Paul J Gans wrote:
I agree. But that is a serious major use. With an
ever larger pixel count it is getting to the point
where I will be up to my navel in DVDs with images
on them.

But storage densities are increasing too. When blu-ray DVDs come out (And more technology after that) it will be less of a problem.

Cheers,
Nicholas Sherlock
CS
Charlie Self
Oct 1, 2005
Nicholas Sherlock wrote:
Paul J Gans wrote:
I agree. But that is a serious major use. With an
ever larger pixel count it is getting to the point
where I will be up to my navel in DVDs with images
on them.

But storage densities are increasing too. When blu-ray DVDs come out (And more technology after that) it will be less of a problem.

25 to 50 gigs. Not bad. My only fear is that by the time Blu Ray is affordable, we’ll be staring at 100 MP cameras. There goes the advance in storage.
NS
Nicholas Sherlock
Oct 1, 2005
Charlie Self wrote:
Nicholas Sherlock wrote:

Paul J Gans wrote:

I agree. But that is a serious major use. With an
ever larger pixel count it is getting to the point
where I will be up to my navel in DVDs with images
on them.

But storage densities are increasing too. When blu-ray DVDs come out (And more technology after that) it will be less of a problem.

25 to 50 gigs. Not bad. My only fear is that by the time Blu Ray is affordable, we’ll be staring at 100 MP cameras. There goes the advance in storage.

But if they increase together at the same rate, it’ll never be any worse than it is right now :).

Cheers,
Nicholas Sherlock
AH
Andrew Haley
Oct 1, 2005
In rec.photo.digital toby wrote:

Paul J Gans wrote:


Now, if such a program were developed for archiving photos in more compact but lossless forms, it could have a big impact.

I agree. But that is a serious major use. With an
ever larger pixel count it is getting to the point
where I will be up to my navel in DVDs with images
on them.

That’s odd – because my folders of 35mm/120 negatives, which carry considerably more information than any JPEG I’ve seen – take up relatively little space.

Relative to what? Colour film is about 100kbytes/mm^3, whereas current hard disc drives are at about 1.5 Mbytes/mm^3.

Andrew.
SM
Skip Middleton
Oct 1, 2005
"toby" wrote in message
Paul J Gans wrote:


Now, if such a program were developed for archiving photos in more compact
but lossless forms, it could have a big impact.

I agree. But that is a serious major use. With an
ever larger pixel count it is getting to the point
where I will be up to my navel in DVDs with images
on them.

That’s odd – because my folders of 35mm/120 negatives, which carry considerably more information than any JPEG I’ve seen – take up relatively little space.

So much for the revolution.

A 300gig external hard drive is a whole lot smaller than the cases for 300 slides.


Skip Middleton
http://www.shadowcatcherimagery.com
R
Roberto
Oct 1, 2005
"Andrew Haley" wrote in message
In rec.photo.digital toby wrote:

That’s odd – because my folders of 35mm/120 negatives, which carry considerably more information than any JPEG I’ve seen – take up relatively little space.

Relative to what? Colour film is about 100kbytes/mm^3, whereas current hard disc drives are at about 1.5 Mbytes/mm^3.

So the thread degenerates to minutiae. Unplug your drive. Now measure. See how it works?
T
toby
Oct 1, 2005
Andrew Haley wrote:
In rec.photo.digital toby wrote:

Paul J Gans wrote:


Now, if such a program were developed for archiving photos in more compact but lossless forms, it could have a big impact.

I agree. But that is a serious major use. With an
ever larger pixel count it is getting to the point
where I will be up to my navel in DVDs with images
on them.

That’s odd – because my folders of 35mm/120 negatives, which carry considerably more information than any JPEG I’ve seen – take up relatively little space.

Relative to what? Colour film is about 100kbytes/mm^3, whereas current hard disc drives are at about 1.5 Mbytes/mm^3.

Yes, that’s why I said the remark was "odd" – since the density is remarkably different. I’m not up to my navel in film negatives. But then maybe the poster is storing the Bettmann Archive.

(Based on specs for a typical 200GB SATA drive such as
http://www.seagate.com/cda/products/discsales/marketing/detail/0,1081,599,00.html I compute hard drive density at 513,349 bytes/mm^3. I suppose a 600GB drive in the same form factor would equate to roughly your figure.)

At (uncompressed) 24-bit, I empirically measure film at a minimum of 17KB/mm^2 (typically higher), and assuming a 0.11mm base and 10% packing waste, I compute a minimum density of 140KB/mm^3 at 24 bit. This does not take into account continuous-tone vs. 8-bit quantisation (if we assume 16-bit samples, density would be more like 280KB/mm^3, which is, interestingly enough, only half current hard drive density). More scientific data at http://medfmt.8k.com/mf/filmwins.html

(Based on assessment of many drum scans, I put low-end 35mm at approx 15 Mp equivalent, 6x7cm at a minimum of 73 Mp. But that’s a whole different war. Not to mention 4×5" and 8×10"…)

Andrew.
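
toby’s density figures are easy to reproduce. The same arithmetic as a Python sketch (the drive envelope is a nominal 3.5-inch form factor; the film numbers are his empirical estimates):

# Hard drive: 200 GB in a 3.5" envelope (101.6 x 147.0 x 26.1 mm).
drive_mm3 = 101.6 * 147.0 * 26.1
print(200e9 / drive_mm3)    # ~513,000 bytes/mm^3

# Film: 17 KB/mm^2 at 24-bit, on a 0.11 mm base, plus 10% packing waste.
film_mm3_per_mm2 = 0.11 * 1.10
print(17e3 / film_mm3_per_mm2)    # ~140,000 bytes/mm^3 (doubled at 16-bit)
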
RL
Rainer Latka
Oct 1, 2005
Ron Hunter wrote on Friday, 30 September 2005 10:18:

[…]
Moreover, the math in the quote leaves much to be desired. A 10 megabyte image across a gigabit Ethernet connection takes less than 1 second to transmit. 60 of those an hour is hardly a significant network load.
Then there is the aspect that MRIs are NOT COLOR. I am sure that going back to B&W is not an option for most of us.

Provided the algorithm is really lossless, one could of course compress the RGB channels individually…
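
Rainer’s per-channel idea, sketched with Pillow (my library choice; "photo.png" is a hypothetical input). Each RGB channel becomes a single-band grayscale image that any lossless codec can handle:

from PIL import Image

img = Image.open("photo.png").convert("RGB")
for name, channel in zip("RGB", img.split()):
    # Each band is an 8-bit grayscale image; PNG keeps it lossless.
    channel.save("channel_%s.png" % name, optimize=True)
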
K
KatWoman
Oct 1, 2005
"Look out the sky is falling"
Chicken Little

"+/-" wrote in message
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock the DI world.
WHAT’S NEXT
Honey, I Shrunk the JPEG
By Shailaja Neelakantan, September 21, 2005
BUSINESS 2.0

If downloading digital photos stalls your PC, spare a thought for the data networks in hospitals. A midsize hospital typically gets 60 requests every hour for MRIs and echocardiograms. At 10 megabytes apiece, the enormous images can quickly cripple a network.

snip
cards. MRIs today, vacation snaps tomorrow.

http://www.business2.com/b2/web/articles/0,17863,1106847,00.html
http://www.matrixview.com/

Download white paper:
http://matrixview.com/files/ABO%20white%20paper.pdf

T
toby
Oct 2, 2005
Skip M wrote:
"toby" wrote in message
Paul J Gans wrote:


Now, if such a program were developed for archiving photos in more compact
but lossless forms, It could have a big impact.

I agree. But that is a serious major use. With an
ever larger pixel count it is getting to the point
where I will be up to my navel in DVDs with images
on them.

That’s odd – because my folders of 35mm/120 negatives, which carry considerably more information than any JPEG I’ve seen – take up relatively little space.

So much for the revolution.

A 300gig external hard drive is a whole lot smaller than the cases for 300 slides.

I imagine I could fit a lot more than 300 slides into the space of 70 DVDs (which Paul said had reached his navel, and they probably would, stacked from the floor). But possibly not enough slides to fill 300GB (say 6000 uncompressed 35mm scans :). Bottom line is, should Paul be archiving on hard disk?


Skip Middleton
http://www.shadowcatcherimagery.com
SM
Skip Middleton
Oct 2, 2005
"toby" wrote in message
Skip M wrote:
"toby" wrote in message
Paul J Gans wrote:


Now, if such a program were developed for archiving photos in more compact
but lossless forms, It could have a big impact.

I agree. But that is a serious major use. With an
ever larger pixel count it is getting to the point
where I will be up to my navel in DVDs with images
on them.

That’s odd – because my folders of 35mm/120 negatives, which carry considerably more information than any JPEG I’ve seen – take up relatively little space.

So much for the revolution.

A 300gig external hard drive is a whole lot smaller than the cases for 300
slides.

I imagine I could fit a lot more than 300 slides into the space of 70 DVDs (which Paul said had reached his navel, and they probably would, stacked from the floor). But possibly not enough slides to fill 300GB (say 6000 uncompressed 35mm scans :). Bottom line is, should Paul be archiving on hard disk?


I would say emphatically, "Hell Yes!" An external HD is less likely to be affected by the vagaries of the main computer, too.


Skip Middleton
http://www.shadowcatcherimagery.com
J
JohnR66
Oct 2, 2005
"Robert L. Haar" wrote in message
On 2005/9/30 6:17 AM, "*– Jinn –*" wrote:

"Nicholas Sherlock" wrote in message
+/- wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no
compromise technology. Young genius about to rock
the DI world.

My bullshit senses are tingling.

Why?

For one thing, the original post cannot be mathematically correct. A 300 percent reduction in size would make the file size negative.
I like the idea. Put files on the drive and you get more space!
RL
Rainer Latka
Oct 2, 2005
Skip M wrote on Sunday, 2 October 2005 06:58:
"toby" wrote in message
[…]
I imagine I could fit a lot more than 300 slides into the space of 70 DVDs (which Paul said had reached his navel, and they probably would, stacked from the floor). But possibly not enough slides to fill 300GB (say 6000 uncompressed 35mm scans :). Bottom line is, should Paul be archiving on hard disk?


I would say emphatically, "Hell Yes!" An external HD is less likely to be affected by the vagaries of the main computer, too.

how come? You’ll have to connect it to read/write on it, so any malicious SW will reach it. And when disconnected, the risk of being dropped is certainly higher than with a built-in disk
T
toby
Oct 2, 2005
Rainer Latka wrote:
Skip M schrieb am Sonntag, 2. Oktober 2005 06:58:
"toby" wrote in message
[…]
I imagine I could fit a lot more than 300 slides into the space of 70 DVDs (which Paul said had reached his navel, and they probably would, stacked from the floor). But possibly not enough slides to fill 300GB (say 6000 uncompressed 35mm scans :). Bottom line is, should Paul be archiving on hard disk?


I would say emphatically, "Hell Yes!" An external HD is less likely to be affected by the vagaries of the main computer, too.

how come? You’ll have to connect it to read/write on it, so any malicious SW will reach it.

It can be made read-only easily enough.

And when disconnected, the risk of being
dropped is certainly higher than with a built-in disk

I could drop a folder of negatives in the bath, too. DVD-Rs are very fragile media; I’d sooner use a hermetic metal case.
R
Roberto
Oct 2, 2005
While getting into the car yesterday, I dropped 4 gig of data onto the concrete drive. It was wrapped only in an acetate envelope. Zero damage. (8×10" negative).
G
ggull
Oct 2, 2005
"Robert L. Haar" < wrote in …
On 2005/9/30 6:17 AM, "*– Jinn –*" > wrote:
"Nicholas Sherlock" wrote …
+/- wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no
compromise technology. Young genius about to rock the DI world.

My bullshit senses are tingling.
Why?

For one thing, the original post cannot be mathematically correct. A 300 percent reduction in size would make the file size negative.

Also, anyone who (or whose press release) describes themselves as a "genius" raises the bs flag too. Especially an unknown genius.
K
kashe
Oct 2, 2005
On Fri, 30 Sep 2005 10:17:31 GMT, "*– Jinn –*" wrote:

"Nicholas Sherlock" wrote in message
+/- wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology. Young genius about to rock
the DI world.

My bullshit senses are tingling.

Why?

Perhaps the forecast of doom is premature, but better algorithms for compression aren’t technically impossible.

Not like MP3 hasn’t been bested from numerous angles.

Perhaps you simply like to express negativity to new ideas /just because/?

Perhaps you like to psychoanalyze people on the basis of a single usenet posting?

Is your license to practice current?
A
Alturas
Oct 3, 2005
On Fri, 30 Sep 2005 09:27:59 +0200, "Trevor" wrote:

<big snip>

Download white paper:
http://matrixview.com/files/ABO%20white%20paper.pdf

Data formats are like standards – there are so many to choose from. I would rather put my trust in JPEG2000, but that’s taking its time getting to the masses – anyone up to date on the Lizardtech claims?

Yes, what is the deal with JPEG2000? Write times too slow or something? It’s nearly 2006 and we still don’t see it in digicams.

Alturas

MJ
Mike Jacoubowsky
Oct 3, 2005
Moreover, the math in the quote leaves much to be desired. A 10 megabyte image across a gigabit Ethernet connection takes less than 1 second to transmit. 60 of those an hour is hardly a significant network load. Then there is the aspect that MRIs are NOT COLOR. I am sure that going back to B&W is not an option for most of us.

While it may be true that MRIs are not in color, I’ve been around hospital imaging equipment enough over the last couple of years to have noticed quite a bit of use of color on the various scanning equipment. It just ain’t a black & white world anymore (although I believe the "color" is artificially added as a visual aid, as I suspect most of the sensors are probably just recording essentially shades of gray).

–Mike– Chain Reaction Bicycles
www.ChainReactionBicycles.com

"Ron Hunter" wrote in message
Randy Berbaum wrote:
In rec.photo.digital Matt Ion wrote:
: Unless it’s freely available to all developers, and doesn’t include some : cockeyed protection scheme that makes it difficult for one to : backup/edit/copy one’s own pictures, it’ll never fly for the mass market.
Also nothing is said about processing time. If a sufficiently complex program is run, it is very possible to compress any photo much further than any currently used photo image. For the compression of an x-ray or MRI is "fast" if the result shows up in a few min while the dr and patient are walking back to the dr’s office. But how many of us are going to be happy waiting 2 min between shots on our digital camera, just to save 1/3 to 1/2 the memory space. Personally I think that I would rather purchase more memory than to have to wait several min (or even 10’s of seconds) between normal shots. JMHO

Now, if such a program were developed for archiving photos in more compact but lossless forms, It could have a big impact.

Randy

==========
Randy Berbaum
Champaign, IL
Moreover, the math in the quote leaves much to be desired. A 10 megabyte image across a gigabit ethernet connection takes less than 1 second to transmit. 60 of those an hour is hardly a significant network load. Then there is the aspect that MRI’s are NOT COLOR. I am sure that going back to B&W is not an option for most of us.


Ron Hunter
KS
Keith Sheppard
Oct 3, 2005
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology
The notion that any technological improvement necessarily heralds the end for that which went before it is usually flawed because it ignores the influence of marketing and commercial pressures.

Many years ago in the UK (and maybe elsewhere) we had two formats for video recordings – VHS and Betamax. Over the course of a few years, VHS eventually displaced Betamax and the latter sank without trace. Was this because VHS was better? No. In fact many people in the know say that Betamax was actually the technically superior. It was a marketing victory. The VHS camp managed to attract more manufacturers to their cause and eventually it became the de-facto standard.

A similar thing happened with satellite TV broadcasting standards about ten years later.

In the case of compression algorithms IMHO it is already too late to make a significant dent in the grip which JPEG has on the market. The new compression algorithm may be the best thing since sliced bread but that alone won’t help it. There is an enormous amount of hardware and software out there which is tied to JPEG and JPEG is cemented in the semi-technical person’s mind as "the" standard for photographic image compression.

It will take more than a good algorithm to displace it. It will take commercial pressure and probably a not inconsiderable amount of money before it can make any headway.

Regards
Keith
M
mark
Oct 3, 2005
ok
M
mark
Oct 3, 2005
ok
CB
Chris Brown
Oct 3, 2005
In article <BT60f.2982$>,
Keith Sheppard wrote:
Finally, JPEG is doomed, algorithm geeks unite! This is the quantum leap, no compromise technology
The notion that any technological improvement necessarily heralds the end for that which went before it is usually flawed because it ignores the influence of marketing and commercial pressures.

Many years ago in the UK (and maybe elsewhere) we had two formats for video recordings – VHS and Betamax. Over the course of a few years, VHS eventually displaced Betamax and the latter sank without trace. Was this because VHS was better? No. In fact many people in the know say that Betamax was actually the technically superior.

Not this old chestnut. In general you’re right about superior technology not necessarily leading to marketing success, but Beta vs VHS is a really bad example of that. VHS was superior where it mattered – it had the ability to store a full-length movie at a time when Beta didn’t.

Arguably the best technology was actually the third standard, the one nobody ever mentions, Philips V2000.
AT
All Things Mopar
Oct 3, 2005
Today Keith Sheppard spoke these views with conviction for everyone’s edification:

Many years ago in the UK (and maybe elsewhere) we had two formats for video recordings – VHS and Betamax. Over the course of a few years, VHS eventually displaced Betamax and the latter sank without trace. Was this because VHS was better? No. In fact many people in the know say that
Betamax was actually the technically superior. It was a marketing victory. The VHS camp managed to attract more
manufacturers to their cause and eventually it became the de-facto standard.

VHS vs. Beta was world-wide. What really killed Beta was Sony’s inability to get enough tape into the smaller cartridge to compete with VHS SLP 6-hour recordings. The first Betamax units could only record for an hour, and I think the longest tape I ever used was still only around 4 hours or so.

I’ve also heard the quality argument, but I personally never saw better quality with my many Betamax recorders before I was forced to switch to VHS. I’ve still got one of the last Sony units, which I use occasionally to look at some 100 old tapes of movies recorded off cable TV.

It will take more than a good algorithm to displace it. It will take commercial pressure and probably a not
inconsiderable amount of money before it can make any
headway.

I agree. Another good example is MP3 for audio. I lost track of the standard when it went past MP15, but there’s no music nor any players that I know of that will recognize the newer formats, and I also don’t know or understand what is better about them.

There’s an old advertising saying that claims "nobody gets it until everybody wants it". That said, your thesis is right on the mark. It’ll be really tough for even a clearly superior compression algorithm to break through the tremendous
installed base of graphics apps that wouldn’t be able to read it for a couple of versions, if the company were even still in business to update legacy software.


ATM, aka Jerry
G
ggull
Oct 3, 2005
"Alturas" wrote …
"Trevor" wrote:
data formats are like standards – there are so many to choose from. I would
rather put my trust in JPEG2000, but that’s taking its time getting to the masses – anyone up to date on the Lizardtech claims?

Yes, what is the deal with JPEG2000? Write times too slow or something? It’s nearly 2006 and we still don’t see it in digicams.

Isn’t the problem that someone tied up the idea of wavelet compression with patents, meaning everyone else would have to pay to play (and making a horrible mess of it all even beyond the cost)? See how intellectual property law encourages innovation ;-)?
AT
All Things Mopar
Oct 3, 2005
Today ggull spoke these views with conviction for everyone’s edification:

"Alturas" wrote …
"Trevor" wrote:
data formats are like standards – there are so many to
choose from. I would rather put my trust in JPEG2000, but that’s taking its time getting to the masses – anyone up to date on the Lizardtech claims?

Yes, what is the deal with JPEG2000? Write times too slow or something? It’s nearly 2006 and we still don’t see it in digicams.

Isn’t the problem that someone tied up the idea of wavelet compression with patents, meaning everyone else would have to pay to play (and making a horrible mess of it all even beyond the cost)? See how intellectual property law
encourages innovation ;-)?

Yes, it actually /does/ encourage innovation by "encouraging" new inventors to create an even more superior version of a commodity. As has happened countless times in history, the original patented device has been eclipsed specifically /because/ someone was encouraged to try harder.

And, to the original inventor, don’t they deserve the 20 years granted to a new patent holder to commercially profit from their invention?

So, nobody has to "pay to play", unless they get to the party too late to get people to buy their better mousetrap. So, if you snooze, you lose.


ATM, aka Jerry
BF
Bill Funk
Oct 3, 2005
On Mon, 03 Oct 2005 09:31:13 GMT, "Keith Sheppard" wrote:

Many years ago in the UK (and maybe elsewhere) we had two formats for video recordings – VHS and Betamax. Over the course of a few years, VHS eventually displaced Betamax and the latter sank without trace. Was this because VHS was better? No. In fact many people in the know say that Betamax was actually the technically superior. It was a marketing victory. The VHS camp managed to attract more manufacturers to their cause and eventually it became the de-facto standard.

My understanding is that Betamax had an advantage in specs ‘on paper’, but on the average TV, very few could see a difference.
When VHS doubled their recording time and Sony refused to follow suit, Betamax started its downhill slide. More people wanted longer tapes than wanted a picture so little better that they couldn’t see the difference.


Bill Funk
Replace "g" with "a"
funktionality.blogspot.com
CB
Chris Brown
Oct 3, 2005
In article ,
All Things Mopar wrote:
Today ggull spoke these views with conviction for everyone’s edification:

"Alturas" wrote …
"Trevor" wrote:
data formats are like standards – there are so many to
choose from. I would rather put my trust in JPEG2000, but that’s taking its time getting to the masses – anyone up to date on the Lizardtech claims?

Yes, what is the deal with JPEG2000? Write times too slow or something? It’s nearly 2006 and we still don’t see it in digicams.

Isn’t the problem that someone tied up the idea of wavelet compression with patents, meaning everyone else would have to pay to play (and making a horrible mess of it all even beyond the cost)? See how intellectual property law
encourages innovation ;-)?

Yes, it actually /does/ encourage innovation by "encouraging" new inventors to create an even more superior version of a commodity. As has happened countless times in history, the original patented device has been eclipsed specifically /because/ someone was encouraged to try harder.

Demonstrably hasn’t happened with software patents. There are areas where research has basically stopped dead (e.g. text compression), because everybody is scared of getting sued if they come up with something new. You don’t even need to win in court if you’re a patent holder – just the threat of a long lawsuit, which the innovator can’t afford to fight and which is ultimately a lottery anyway, is generally sufficient to make people settle. Software patents are no more than a legalised protection racket.
T
toby
Oct 3, 2005
All Things Mopar wrote:
Today ggull spoke these views with conviction for everyone’s edification:

"Alturas" wrote …
"Trevor" wrote:
data formats are like standards – there are so many to
choose from. I would rather put my trust in JPEG2000, but that’s taking its time getting to the masses – anyone up to date on the Lizardtech claims?

Yes, what is the deal with JPEG2000? Write times too slow or something? It’s nearly 2006 and we still don’t see it in digicams.

Isn’t the problem that someone tied up the idea of wavelet compression with patents, meaning everyone else would have to pay to play (and making a horrible mess of it all even beyond the cost)? See how intellectual property law
encourages innovation ;-)?

"Intellectual property" law is not universally considered a meaningful term. It’s part of the Newspeak arsenal of patent lobbyists.

Yes, it actually /does/ encourage innovation by "encouraging" new inventors to create an even more superior version of a commodity.

And if *they* claim a patent on *that*, we’re back to square zero.

As has happened countless times in history, the
original patented device has been eclipsed specifically
/because/ someone was encouraged to try harder.

And, to the original inventor, don’t they deserve the 20 years granted to a new patent holder to commercially profit from their invention?

So, nobody has to "pay to play", unless they get to the party too late to get people to buy their better mousetrap. So, if you snooze, you lose.

None of those points are germane to the *software* patent debate. See: http://www.gnu.org/philosophy/fighting-software-patents.html


AT
All Things Mopar
Oct 3, 2005
Today Chris Brown spoke these views with conviction for
everyone’s edification:

Yes, it actually /does/ encourage innovation by
"encouraging" new inventors to create an even more superior version of a commodity. As has happened countless times in history, the original patented device has been eclipsed
specifically /because/ someone was encouraged to try
harder.

Demonstrably hasn’t happened with software patents. There are areas where research has basically stopped dead (e.g. text compression), because everybody is scared of getting sued if they come up with something new. You don’t even
need to win in court if you’re a patent holder – just the threat of a long lawsuit, which the innovator can’t afford to fight and which is ultimately a lottery anyway, is generally sufficient to make people settle. Software patents are
no more than a legalised protection racket.

I’m still not convinced, except when the Feds incorrectly grant a patent, as they did for M$’s double-clicking. Double-clicking was first invented by Xerox for the ill-fated Star, then "stolen" and adapted to a single-button mouse by Apple, then later stolen again by Bill Gates, who also stole the entire idea for the Windoze GUI. How the F__k double-clicking is an "invention", when it is really a somewhat sophisticated timing loop, is beyond me.

(I commented jokingly a while back somewhere else that
Logitech and Apple are now paying Bill to use double-clicking on their mice).

If we go back to the maybe-OT comparison of VHS to Beta, that was clearly the result of trying to get around Sony’s patents.

But, as you commented on software, there is precious little that can actually be patented, as it is almost always an expression of an idea, and not an idea itself or an invention. I don’t know but could believe that algorithms can be patented without ever demonstrating that they even work, as there is no requirement for an invention to do what it claims to.

But the original JPEG, if it is patented by the group from which it derives its name at all, must be right at or maybe even beyond the 20-year protection limit. It also seems far-fetched to me that purveyors of graphics software have been paying royalties all this time.

I agree with you that protracted litigation, or the threat of it, is enough to drive off all but the wealthiest and most tenacious inventors. M$ succeeded very well against both Apple and Netscape, to name just two pathological examples.

Finally, I’m not an attorney nor involved in intellectual property protection nor an active programmer for the last 10 years, so I’d appreciate it if you could cite some (in)famous or well-known software patents that are for the algorithm, and not the code? Also, do any of the legal beagles here know if the original JPEG specification was or wasn’t copyrighted or patented, or whether it was intentionally placed in the public domain to speed up adoption?


ATM, aka Jerry
AT
All Things Mopar
Oct 3, 2005
Today toby spoke these views with conviction for everyone’s edification:

"Intellectual property" law is not universally considered a meaningful term. It’s part of the Newspeak arsenal of
patent lobbyists.

It is an entirely appropriate and universally accepted term, and attorneys and entire law offices specialize in it. Lawyers are employed by anyone producing anything in order to find the best way to protect themselves, other than trying to sweep everything under the "trade secret" crap, which is doomed to failure by definition.

And, I personally dealt with the Patent and Copyright
attorneys for a decade during the latter days of my employment at Chrysler, and I /know/ that they vigorously defended Chrysler’s "intellectual property", even in the courts of foreign countries.

About the only major suit they failed to win was in trying to defend the Jeep
CB
Chris Brown
Oct 3, 2005
In article ,
toby wrote:
"Intellectual property" law is not universally considered a meaningful term. It’s part of the Newspeak arsenal of patent lobbyists.

Hardly – it’s generally accepted as being a short term for patent, copyright and trademark law.
T
toby
Oct 3, 2005
All Things Mopar wrote:
Today toby spoke these views with conviction for everyone’s edification:

"Intellectual property" law is not universally considered a meaningful term. It’s part of the Newspeak arsenal of
patent lobbyists.

It is an entirely appropriate and universally accepted term, and attorneys and entire law offices specialize in it. …
And, I personally dealt with the Patent and Copyright
attorneys for a decade during the latter days of my employment at Chrysler, and I /know/ that they vigorously defended
Chrysler’s "intellectual property", even in the courts of foreign countries. …

Does any of that make it less bureaucratic Newspeak?

Yes, it actually /does/ encourage innovation by
"encouraging" new inventors to create an even more superior version of a commodity.

And if *they* claim a patent on *that*, we’re back to
square zero.

No! That is /precisely/ what advances the state-of-the-art in all "soft" and "hard" commodities. Were it not for "necessity being the mother of invention", science would not advance.
None of those points are germane to the *software* patent debate. See:
http://www.gnu.org/philosophy/fighting-software-patents.html

That web site looks like it was posted by a pundit, not an attorney. Even the directory name is suspect, "philosophy". Now, cite some real examples pro or con to support your assertion.

I guess you haven’t heard of Richard Stallman.

–T

There’s no point in me trying to do that, as it is impossible to prove a null or negative hypothesis, and highly unlikely to change your mind.


ATM, aka Jerry
RW
Roger Whitehead
Oct 3, 2005
In article , Chris Brown wrote:
"Intellectual property" law is not universally considered a meaningful term. It’s part of the Newspeak arsenal of patent lobbyists.

Hardly – it’s generally accepted as being a short term for patent, copyright and trademark law.

Quite so, and for designs.

The expression has been in use for over 150 years, viz:
"Only in this way can we protect intellectual property, the labors of the mind, productions and interests as much a man’s own as the wheat he cultivates.". Source: Woodbury & Minot – Reports of Cases Circuit Court of
U.S., 1847.

The World Intellectual Property Organization (WIPO), established in 1967, is an agency of the United Nations and has 182 nations as members. It’s hard to get more universal than that.

It sounds as though the news of the term’s wide acceptance has been a little slow in reaching the Antipodes. 😎



Roger
T
toby
Oct 3, 2005
Roger Whitehead wrote:
In article , Chris Brown wrote:
"Intellectual property" law is not universally considered a meaningful term. It’s part of the Newspeak arsenal of patent lobbyists.

Hardly – it’s generally accepted as being a short term for patent, copyright and trademark law.

However, it is also used to deliberately blur the distinctions between those arenas.

Quite so, and for designs.

The expression has been in use for over 150 years, viz:
"Only in this way can we protect intellectual property, the labors of the mind, productions and interests as much a man’s own as the wheat he cultivates.". Source: Woodbury & Minot – Reports of Cases Circuit Court of
U.S., 1847.

I did not know the coinage was as old as that. Thanks.

The World Intellectual Property Organization (WIPO), established in 1967, is an agency of the United Nations and has 182 nations as members. It’s hard to get more universal than that.

It sounds as though the news of the term’s wide acceptance has been a little slow in reaching the Antipodes. 😎

I did say ‘not universally’. Here’s a Northern hemisphere opinion: http://www.groklaw.net/article.php?story=20040805065337222



AT
All Things Mopar
Oct 3, 2005
Today toby spoke these views with conviction for everyone’s edification:

I guess you haven’t heard of Richard Stallman.

Nope. Who he?


ATM, aka Jerry
RK
Richard Kettlewell
Oct 3, 2005
All Things Mopar writes:
I’m still not convinced, except when the Feds incorrectly grant a patent, as they did for M$’s double-clicking. Double-clicking was first invented by Xerox for the ill-fated Star, then "stolen" and adapted to a single-button mouse by Apple, then later stolen again by Bill Gates, who also stole the entire idea for the Windoze GUI. How the F__k double-clicking is an "invention", when it is really a somewhat sophisticated timing loop, is beyond me.

One might observe that patent offices have a clear economic interest in granting patents, but less so in actually understanding them.

Finally, I’m not an attorney nor involved in intellectual property protection nor an active programmer for the last 10 years, so I’d appreciate it if you could cite some (in)famous or well-known software patents that are for the algorithm, and not the code?

Software patents are for the algorithm and not the code. Code is protected by copyright instead.

LZW is the best known example, being the algorithm behind the GIF file format. More details:

http://www.gnu.org/philosophy/gif.html
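For the curious, the core idea of LZW fits in a few lines of Python – a toy sketch of the dictionary-growing scheme, not the patented variable-width bit-packing details and not production code:

def lzw_compress(data: bytes) -> list:
    # Start with all single-byte strings, then grow the table:
    # emit one integer code per longest already-known prefix.
    table = {bytes([i]): i for i in range(256)}
    out, current = [], b""
    for b in data:
        candidate = current + bytes([b])
        if candidate in table:
            current = candidate
        else:
            out.append(table[current])
            table[candidate] = len(table)  # register the new string
            current = bytes([b])
    if current:
        out.append(table[current])
    return out

print(lzw_compress(b"TOBEORNOTTOBEORTOBEORNOT"))

Real implementations also cap the table size and pack the codes into variable-width bit fields, but it was essentially this scheme that the patent covered.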

Also, do any of the legal beagles here know if the original JPEG specification was or wasn’t copyrighted or patented, or whether it was intentionally placed in the public domain to speed up adoption?

The JPEG specification is copyrighted, but copyright on a specification does not in general preclude implementing it.

The algorithms required to implement baseline JPEG are (believed to be) patent-unencumbered, and this is deliberate.


http://www.greenend.org.uk/rjk/
AT
All Things Mopar
Oct 3, 2005
Today Richard Kettlewell spoke these views with conviction for everyone’s edification:

One might observe that patent offices have a clear economic interest in granting patents, but less so in actually
understanding them.

Come again? They can’t be generating all that much revenue from patents, although I’d certainly agree that nobody tries to understand how the inventions work. The wonks are way too busy scrutinizing the millions of existing patents to see if the new kid on the block has something all new, a derivative work, or simply the same as a previous patent just tweaked a little.

And, to the real point of this thread, even if a patent is granted for the new lossless-but-smaller compression scheme, there’s no legal requirement whatsoever for it even to work!

Software patents are for the algorithm and not the code. Code is protected by copyright instead.

That’s what I thought.

LZW is the best known example, being the algorithm behind the GIF file format. More details:

Who owns that and how much are they getting paid in royalties? And how long has it been since the original patent was granted? Then, too, there are several derivatives of the
original 256-color GIF.

http://www.gnu.org/philosophy/gif.html

This is the same web site I already think is bogus. And, wasn’t it CompuServe that invented GIF, and not Unisys (which didn’t exist) or IBM? Being a cynic, I can’t imagine IBM having enough sense to patent anything, after how easily they got hoodwinked by a very young, but very astute, Bill Gates. Gates had buried a single line in the middle of a 150-page agreement that gave him the rights to market his own version of PC DOS. And, the rest is history.

The JPEG specification is copyrighted, but copyright on a specification does not in general preclude implementing it.

I’d believe this. I would also believe that the photographers /wanted/ software developers to use the new standard. Maybe some people patented their algorithm for implementing the JPEG spec and maybe they didn’t. If they did, who’s paying the royalties to whom, and for how much longer will this go on?

The algorithms required to implement baseline JPEG are
(believed to be) patent-unencumbered, and this is
deliberate.

Then how the hell is some new compression algorithm in any way an infringement of JPEG?


ATM, aka Jerry
T
Tacit
Oct 3, 2005
In article ,
All Things Mopar wrote:

http://www.gnu.org/philosophy/gif.html

This is the same web site I already think is bogus. And, wasn’t it CompuServe that invented GIF, and not Unisys (which didn’t exist) or IBM?

That’s an easy one.

LZW is not a graphics format. LZW is a compression algorithm.

CompuServe invented GIF. Part of the GIF standard involves LZW compression. Each row of pixels in a GIF image is compressed using LZW.

CompuServe owns the GIF standard, which they created; but they had to license LZW compression, which was patented. (I say "was" because the patent has since expired.) It’s not the GIF specification that was patented; it’s the compression technique that GIF images use.


Art, photography, shareware, polyamory, literature, kink: all at http://www.xeromag.com/franklin.html
T
toby
Oct 3, 2005
All Things Mopar wrote:

LZW is the best known example, being the algorithm behind the GIF file format. More details:

Who owns that and how much are they getting paid in royalties? And how long has it been since the original patent was
granted? Then, too, there are several derivatives of the
original 256-color GIF.

Yes, people did make some creative attempts to avoid being sued. None were particularly successful except PNG
[http://www.libpng.org/pub/png/]. You might argue that this is a case where a patent sparked something new ‘of necessity’, but I still feel that ‘fear of litigation’ sounds more like a deadly discouragement than a social boon.

http://www.gnu.org/philosophy/gif.html

This is the same web site I already think is bogus.

Have you heard of the GPL? It comes from the same bogus source
[http://www.gnu.org/licenses/licenses.html]. And has changed the whole
bogus software industry for the bogus better.

And,
wasn’t it CompuServe that invented GIF,

It’s not about GIF. It’s about LZW, which Unisys happened to be able to produce a patent for (although there was much related independent invention).

and not Unisys (which
didn’t exist) or IBM? Being a cynic, I can’t imagine IBM having enough sense to patent anything, …

That’s a strange statement since they may have the largest patent portfolio of any tech company.

The JPEG specification is copyrighted, but copyright on a specification does not in general preclude implementing it.

I’d believe this.

I want to believe it too. Copyright is not a bad idea. A patent that precludes implementation is fatal.

I would also believe that the photographers
/wanted/ software developers to use the new standard. Maybe some people patented their algorithm for implementing the JPEG spec and maybe they didn’t. If they did, who’s paying the royalties to whom, and for how much longer will this go on?

Now you’re getting warm.

The algorithms required to implement baseline JPEG are
(believed to be) patent-unencumbered, and this is
deliberate.

Then, how the hell is some new compression algorithm in anyway an infringement of JPEG?


ATM, aka Jerry
KS
Keith Sheppard
Oct 4, 2005
For one thing, the original post cannot be mathematically correct. A 300 percent reduction in size would make the file size negative.

Oh dear, that’s got my imagination running riot. If we accept that as true, what does it mean? A compression algorithm so efficient that it has spare capacity to automatically absorb some of the surrounding filestore and reduce overall occupancy. Anyone got a few snaps in this format that I can borrow? I don’t want to look at them but my disk is getting a bit full and I could do with the extra space…
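The arithmetic, for anyone keeping score (the 10 MB original is assumed purely for illustration):

original_mb = 10.0
for pct in (15, 100, 300):
    print(f"{pct}% smaller: {original_mb * (1 - pct / 100):.1f} MB")
# 15% smaller: 8.5 MB
# 100% smaller: 0.0 MB
# 300% smaller: -20.0 MB

Presumably the article meant the JPEG is up to 300 percent larger than the MVU – i.e. the MVU can be as little as a quarter of the size – which at least keeps the file on the right side of zero.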

Keith
CB
Chris Brown
Oct 4, 2005
In article ,
All Things Mopar wrote:

http://www.gnu.org/philosophy/gif.html

This is the same web site I already think is bogus.

They produced the compilers, editors and software tools which half the software industry uses for its bread and butter work. What have *you* done that’s comparable? Bogus indeed…
RW
Roger Whitehead
Oct 4, 2005
In article , Toby
wrote:
I did say ‘not universally’.

It’s a matter of record that you did – http://tinyurl.com/9x44t .



Roger
HR
Howard Roark
Oct 4, 2005
Chris Brown wrote:

[…]

http://www.gnu.org/philosophy/gif.html

[…]

They produced the compilers, editors and software tools which half the software industry uses for its bread and butter work. […]

I believe you are confusing GNU with *nix. If you’re not, then "half the software industry" is a ridiculous exaggeration. Note that I do not agree with the person you were responding to. I’m just pointing out that you seem to overestimate GNU’s importance. 🙂

B.
K
kashe
Oct 4, 2005
On 3 Oct 2005 16:04:48 -0700, "toby" wrote:

All Things Mopar wrote:

and not Unisys (which
didn’t exist) or IBM? Being a cynic, I can’t imagine IBM having enough sense to patent anything, …

That’s a strange statement since they may have the largest patent portfolio of any tech company.

Definitely. In fact, at
<http://www.research.ibm.com/about/career.shtml> you will find:

Careers at Research

We don’t just invent, we innovate.

Researchers come to IBM to make an impact — on the industry and on the world. As the largest IT research organization, IBM Research enables IBM to produce more breakthroughs than any company in the industry, averaging 9.3 patents per day.
RK
Richard Kettlewell
Oct 4, 2005
All Things Mopar writes:
Richard Kettlewell writes:

LZW is the best known example, being the algorithm behind the GIF file format. More details:

Who owns that and how much are they getting paid in royalties? And, how long has it been since the original patent was granted? Then, too, there’s several derivatives of the original 256 color GIF.

The LZW patent has now expired. However, Unisys did ask for royalties from users of GIF while it was in force.

In my previous job I converted my employer’s software to generate PNG files instead, so that we didn’t have to pay up. Make of this what you will.

http://www.gnu.org/philosophy/gif.html

This is the same web site I already think is bogus. And, wasn’t it CompuServe that invented GIF, and not Unisys (which didn’t exist) or IBM?

The URL above explains. It is not "bogus" whatever you may think.

Being a cynic, I can’t imagine IBM having enough sense to patent anything, after how easily they got hoodwinked by a very young, but very astute, Bill Gates. Gates had buried a single line in the middle of a 150-page agreement that gave him the rights to market his own version of PC DOS. And, the rest is history.

IBM hold very large numbers of patents.


http://www.greenend.org.uk/rjk/
CB
Chris Brown
Oct 4, 2005
In article ,
Howard Roark wrote:
Chris Brown wrote:

[…]

http://www.gnu.org/philosophy/gif.html

[…]

They produced the compilers, editors and software tools which half the software industry uses for its bread and butter work. […]

I believe you are confusing GNU with *nix. If you’re not, then "half the software industry" is a ridiculous exaggeration.

I’m not, and it’s not. In particular, IME the first thing someone does when they get a Solaris box is install gcc, emacs, and the rest of the GNU tools on it (using a base-config Solaris installation is truly a hair-shirt experience). Elsewhere, Linux is pretty much the de facto standard for small to medium Internet server type stuff, and that’s pretty much entirely built on the GNU tools, as is OS X. When real software developers are forced to use Windows, they tend to go hunting for the GNU tools as well, just to make life slightly less unbearable.
HR
Howard Roark
Oct 4, 2005
Chris Brown wrote:

In article ,
Howard Roark wrote:
Chris Brown wrote:

[…]

http://www.gnu.org/philosophy/gif.html

[…]

They produced the compilers, editors and software tools which half the software industry uses for its bread and butter work.
[…]

I believe you are confusing GNU with *nix. If you’re not, then "half the software industry" is a ridiculous exaggeration.

[snipped Solaris]

Elsewhere, Linux is pretty much the de facto standard for small to medium Internet server type stuff, and that’s pretty much entirely built on the GNU tools,

Linux is *not* the de facto standard for any type of server. The various BSDs are just as popular and, thankfully, they aren’t based on GNU.

as is OS X.

OS/X is based on FreeBSD which does *not* use GNU tools.

When real software developers are forced to use Windows, they tend to go hunting for the Gnu tools as well, just to make life slightly less unbearable.

You are generalising. Again, "half the software industry" is a gross overstatement.

B.
AT
All Things Mopar
Oct 4, 2005
Today tacit spoke these views with conviction for everyone’s edification:

LZW is not a graphics format. LZW is a compression
algorithm.

CompuServe invented GIF. Part of the GIF standard involves LZW compression. Each row of pixels in a GIF image is
compressed using LZW.

CompuServe owns the GIF standard, which they created; but they had to lizense LZW compression, which was patented. (I say "was" because the patent has since expired.) It’s not the GIF specification that was patented; it’s the
compression technique that GIF images use.

This thread is getting pretty far afield. Whether these older standards, compression algorithms, or software implementations were or weren’t protected seems to have little to do with the success or failure of a new scheme.

But, to succeed, the new proposed compression algorithm is going to have to overcome an installed base of literally millions of users’ software systems and cameras. And my assertion is that if it gets all locked up with a patent and the owner charges onerous royalties, no major software house will implement it, and it will quickly die.


ATM, aka Jerry
AT
All Things Mopar
Oct 4, 2005
Today toby spoke these views with conviction for everyone’s edification:

and not Unisys (which
didn’t exist) or IBM? Being a cynic, I can’t imagine IBM having enough sense to patent anything, …

That’s a strange statement since they may have the largest patent portfolio of any tech company.

That wasn’t my point. I’m sure that IBM has hundreds of thousands of patents on their obsolete mainframe technology. I was questioning the LZW connection to IBM.


ATM, aka Jerry
CB
Chris Brown
Oct 4, 2005
In article ,
Howard Roark wrote:

Linux is *not* the de facto standard for any type of server. The various BSDs are just as popular and,

That’s not the impression I’ve got from spending the last decade working in the semiconductor industry. I never saw a BSD installation (apart from SunOS 4, which doesn’t count).

thankfully, they aren’t based on GNU.

What are they compiled with?

as is OS X.

OS/X is based on FreeBSD which does *not* use GNU tools.

OS X is built with gcc and ships with loads of gnu tools, e.g.:

torch:~ cbrown$ uname
Darwin
torch:~ cbrown$ which tar
/usr/bin/tar
torch:~ cbrown$ tar --help | tail -1
Report bugs to .
CB
Chris Brown
Oct 4, 2005
In article ,
All Things Mopar wrote:
Today toby spoke these views with conviction for everyone’s edification:

and not Unisys (which
didn’t exist) or IBM? Being a cynic, I can’t imagine IBM having enough sense to patent anything, …

That’s a strange statement since they may have the largest patent portfolio of any tech company.

That wasn’t my point. I’m sure that IBM has hundreds of thousands of patents on their obsolete mainframe technology.

If you ever decide to visit planet Earth, you may discover that IBM is a software patent generating powerhouse, filing thousands per year.

I was questioning the LZW connection to IBM.

IBM have a patent on LZW which expires in 2006.
T
toby
Oct 4, 2005
Howard Roark wrote:
Chris Brown wrote:

[…]

http://www.gnu.org/philosophy/gif.html

[…]

They produced the compilers, editors and software tools which half the software industry uses for its bread and butter work. […]

I believe you are confusing GNU with *nix. If you’re not, then "half the software industry" is a ridiculous exaggeration. Note that I do not agree with the person you were responding to. I’m just pointing out that you seem to overestimate GNU’s importance. 🙂

Which is very difficult to do. At some point in the 1990s gcc, to name just one example, began to take over from vendor compilers for very good reasons (often better code, better maintained, portable, standards-compliant, etc.). The pendulum has swung back somewhat, but most of the large vendors still use gcc to build their systems and it’s by far the most common standard system compiler.

That’s not to mention the rest of the GNU software library, which is absolutely indispensable equipment (also BSD has its own versions of some of the tools).

"At least half" the software industry is an understatement if one is speaking of UNIX-based development. (When Windows is included, …oh why bother.)

T
toby
Oct 4, 2005
Howard Roark wrote:
Chris Brown wrote:

In article ,
Howard Roark wrote:
Chris Brown wrote:

[…]

http://www.gnu.org/philosophy/gif.html

[…]

They produced the compilers, editors and software tools which half the software industry uses for its bread and butter work.
[…]

I believe you are confusing GNU with *nix. If you’re not, then "half the software industry" is a ridiculous exaggeration.

[snipped Solaris]

Elsewhere, Linux is pretty much the de facto standard for small to medium Internet server type stuff, and that’s pretty much entirely built on the GNU tools,

Linux is *not* the de facto standard for any type of server. The various BSDs are just as popular and, thankfully, they aren’t based on GNU.

as is OS X.

OS/X is based on FreeBSD which does *not* use GNU tools.

Nonsense. It’s GNU based and the system compiler is gcc. I suggest you try running it sometime. In line with Linux, sensibly enough, Apple prefers the GNU utilities to the similar BSD utilities in most instances.
[ http://www.macdevcenter.com/pub/a/mac/2005/09/27/what-is-darwin.html ]

The influence of FreeBSD is often overstated. I even had someone tell me recently that OS X was really "Linux". A lot of people like talking about OS X who have never used it, let alone developed on it, and probably never used a Mac either.

When real software developers are forced to use Windows, they tend to go hunting for the Gnu tools as well, just to make life slightly less unbearable.

You are generalising. Again, "half the software industry" is a gross overstatement.

As I said – only if you include Windows-based development… if you’re talking UNIX development, it’s probably an understatement (since Linux is totally GNU based, and has the lion’s share and growing).

T
toby
Oct 4, 2005
Roger Whitehead wrote:
In article , Toby
wrote:
I did say ‘not universally’.

It’s a matter of record that you did – http://tinyurl.com/9x44t .

Yes, I did say ‘not universally’.
That’s three times now.



V
veldy71
Oct 4, 2005
In rec.photo.digital toby wrote:
OS/X is based on FreeBSD which does *not* use GNU tools.

Nonsense. It’s GNU based and the system compiler is gcc. I suggest you try running it sometime. In line with Linux, sensibly enough, Apple prefers the GNU utilities to the similar BSD utilities in most instances.
[ http://www.macdevcenter.com/pub/a/mac/2005/09/27/what-is-darwin.html ]

FreeBSD is NOT GNU based. Some of the compiler tool chain is GNU based, but that is not the OS. In fact, you can install the base system and not install the GNU tools.

The influence of FreeBSD is often overstated. I even had someone tell me recently that OS X was really "Linux". A lot of people like talking about OS X who have never used it, let alone developed on it, and probably never used a Mac either.

OS X is based upon FreeBSD 3.x … at least, the kernel branched from it. They even hired one of the FreeBSD project founders to help them out. IIRC, Hubbard was his [last] name.

As I said – only if you include Windows-based development… if you’re talking UNIX development, it’s probably an understatement (since Linux is totally GNU based, and has the lion’s share and growing).

The Linux OS [using the term very loosely here] is a hodgepodge of licensed code. Some BSD, some GNU, some whatever else. Look at the source code for Netkit (telnet, ftp, etc.).

The Linux Kernel is GNU licensed, although some of the kernel modules are not. I believe ALSA, the latest standard Linux sound code, is not GNU licensed, but that may have changed. I am not sure that reiserfs is GNU licensed either.


Thomas T. Veldhouse
Key Fingerprint: 2DB9 813F F510 82C2 E1AE 34D0 D69D 1EDC D5EC AED1 Spammers please contact me at
T
toby
Oct 4, 2005
Thomas T. Veldhouse wrote:
In rec.photo.digital toby wrote:
OS/X is based on FreeBSD which does *not* use GNU tools.

Nonsense. It’s GNU based and the system compiler is gcc. I suggest you try running it sometime. In line with Linux, sensibly enough, Apple prefers the GNU utilities to the similar BSD utilities in most instances.
[ http://www.macdevcenter.com/pub/a/mac/2005/09/27/what-is-darwin.html ]

FreeBSD is NOT GNU based. Some of the compiler tool chain is GNU based, but that is not the OS. In fact, you can install the base system and not install the GNU tools.

I was talking about OS X.

The influence of FreeBSD is often overstated. I even had someone tell me recently that OS X was really "Linux". A lot of people like talking about OS X who have never used it, let alone developed on it, and probably never used a Mac either.

OS X is based upon FreeBSD 3.x … at least, the kernel branched from it. They even hired one of the FreeBSD project founders to help them out. IIRC, Hubbard was his [last] name.

"a Mach 3.0-based microkernel that has been modified to include portions of FreeBSD for performance reasons" (from link cited). The name is Jordan Hubbard.

As I said – only if you include Windows-based development… if you’re talking UNIX development, it’s probably an understatement (since Linux is totally GNU based, and has the lion’s share and growing).

The Linux OS [using the term very loosely here] is a hodgepodge of licensed code. Some BSD, some GNU, some whatever else. Look at the source code for Netkit (telnet, ftp, etc.).

The Linux Kernel is GNU licensed, …

I wasn’t talking about licensing. I was talking about the GNU software, which has been labelled ‘unimportant’ in this thread; yet Linux is a good example of an O/S that uses all the relevant GNU utilities (rather than a BSD lex or yacc, for instance). Of course it also includes BSD utilities for other roles.

Anyway you and I have no disagreement, but others in this thread haven’t heard of GNU or how it underpins the larger part of the (non-Windows) industry.


RK
Richard Kettlewell
Oct 4, 2005
"toby" writes:
Howard Roark wrote:

You are generalising. Again, "half the software industry" is a gross overstatement.

As I said – only if you include Windows-based development… if you’re talking UNIX development, it’s probably an understatement (since Linux is totally GNU based, and has the lion’s share and growing).

I think it would be worth including the embedded software industry (which includes, not least, the firmware for digital cameras) in this kind of estimate.


http://www.greenend.org.uk/rjk/
D
davidjl
Oct 4, 2005
"Richard Kettlewell" wrote:
"toby" writes:
Howard Roark wrote:

You are generalising. Again, "half the software industry" is a gross overstatement.

As I said – only if you include Windows-based development… if you’re talking UNIX development, it’s probably an understatement (since Linux is totally GNU based, and has the lion’s share and growing).

I think it would be worth including the embedded software industry (which includes, not least, the firmware for digital cameras) in this kind of estimate.

I’m seeing a lot more GNU stuff used there of late: in Japan at least, gcc is the high-end compiler of choice.

David J. Littleboy
Tokyo, Japan
RW
Roger Whitehead
Oct 4, 2005
In article , Toby
wrote:
Yes, I did say ‘not universally’.
That’s three times now.

You seem proud of your error.



Roger
HR
Howard Roark
Oct 5, 2005
Chris Brown wrote:

In article ,
Howard Roark wrote:

Linux is *not* the de facto standard for any type of server. The various BSDs are just as popular and,

That’s not the impression I’ve got from spending the last decade working in the semiconductor industry. I never saw a BSD installation (apart from SunOS 4, which doesn’t count).

And who are you again?

thankfully, they aren’t based on GNU.

What are they compiled with?

You don’t even know the basics of OSes which existed for decades before Linux came along, yet you make sweeping statements about "de facto standards" and what the industry uses in "your experience".

Wow.

as is OS X.

OS/X is based on FreeBSD which does *not* use GNU tools.

OS X is built with gcc and ships with loads of gnu tools, e.g.:
torch:~ cbrown$ uname
Darwin
torch:~ cbrown$ which tar
/usr/bin/tar
torch:~ cbrown$ tar --help | tail -1
Report bugs to .

The GNU implementation of tar is very good. I use it myself in favour of other implementations. Oh, wait, you don’t actually think that its presence in OS/X means OS/X is GNU based, do you? Oh dear, I rather think you probably do.

Well, I’ll let you go back to thinking you know what you’re talking about, now. I’ve found what I was looking for in the rpd group.

B.
R
Roberto
Oct 5, 2005
Chris Brown wrote:
That’s not the impression I’ve got from spending the last decade working in the semiconductor industry. I never saw a BSD installation (apart from SunOS 4, which doesn’t count).

I am not surprised. Working "in the semiconductor industry" has nothing to do with operating systems, per se. You may as well be a janitor anywhere.
V
veldy71
Oct 5, 2005
In rec.photo.digital Lorem Ipsum wrote:
Chris Brown wrote:
That’s not the impression I’ve got from spending the last decade working in the semiconductor industry. I never saw a BSD installation (apart from SunOS 4, which doesn’t count).

I am not surprised. Working "in the semiconductor industry" has nothing to do with operating systems, per se. You may as well be a janitor anywhere.

Yahoo has always run their servers on FreeBSD.


Thomas T. Veldhouse
Key Fingerprint: 2DB9 813F F510 82C2 E1AE 34D0 D69D 1EDC D5EC AED1 Spammers please contact me at
MR
Mike Russell
Oct 5, 2005
Re the Unix "banter". I’ll have to agree with Chris Brown and others, and disagree with most of HR’s statemtents and attitude, which are, IMHO, uninformative and inappropriate, respectively.

For those who care, I’ve been working with Unix, off and on, for a very long time indeed, starting at Berkeley in the early 70’s. Although I myself am only a minor figure in that world (Tut will back me up on that point :-), I have met and spoken to most of the great personalities involved in the making of v6, BSD, and System V.

Unix, perhaps because of its tumultuous dependency on academic development effort, has turned out to be difficult to take to market, and many, including DEC, Novell, and AT&T, have spent more energy on infighting than on advancing the Unix market, which is now a fragmented rubble. Even Sun is having problems, about to go Open Source if they have not already, and OSes like FreeBSD and others are marginalized to the point that Linux, together with the GNU software, is indeed the dominant star of the *nix world.

Stallman is a legitimate, if controversial, figure, and deserves kudos for several reasons: surviving all this time with his philosophy vindicated, a large following with a body of work to point to, and a resulting impact on the commercial world that rivals any conventionally developed and marketed software package. And by that I mean everything except Windows 🙂 —

Mike Russell
www.curvemeister.com
R
Roberto
Oct 6, 2005
"Thomas T. Veldhouse" wrote in message

Yahoo has always run their servers on FreeBSD.

Is it out-of-the-box FreeBSD? I think not.

U*ix can be good. It can be _excellent_, and it can be secure IF it is thoroughly mediated by experts. People who go with Name Brand U*X and update just as soon as patches and new versions are released are screwed; that’s not mediation, it is just plain dumb.

I love BSD, possibly for romantic reasons (Oh, Bezerkeley then…) but I don’t trust *x any longer because, frankly, it’s grown so big I can’t hold the whole system in my head at once. A pared *x is cool. I don’t mean pared to the point of throwing out BSD – like the Unix the 3B2 had, which would run until you hit it with a sledge hammer, lots of times – but that was a controller, not exactly a vulnerable system.
T
toby
Oct 6, 2005
Lorem Ipsum wrote:

I love BSD, possibly for romantic reasons (Oh, Bezerkeley then…) but I don’t trust *x any longer because, frankly, it’s grown so big I can’t hold the whole system in my head at once.

So you switched to…what?

A pared *x is cool. I don’t mean pared
to throw out BSD, like the Unix the 3B2 had – which would run until you hit it with a sledge hammer –

Well, that applies to nearly all Un*x systems I’ve run. SunOS 4, NetBSD, even Linux come to mind.

–Toby (fan of PDP-11 UNIX among other things)

lots of times, but it was a controller, not
exactly a vulnerable system.
BO
Bryan Olson
Oct 6, 2005
toby wrote:
Roger Whitehead wrote:
Hardly – it’s ["intelectual property" is] gerenally accepted as being a short term for patent, copyright and trademark law.

However, it is also used to deliberately blur the distinctions between those arenas.

More importantly, to blur the distinction between I.P. and real property. Some see an advantage in equating infringement to theft, and kids who make party-mixes to pirates.


–Bryan
R
Roberto
Oct 6, 2005
"toby" wrote in message
Lorem Ipsum wrote:

I love BSD, possibly for romantic reasons (Oh, Bezerkeley then…) but I don’t trust *x any longer because, frankly, it’s grown so big I can’t hold
the whole system in my head at once.

So you switched to…what?

VMS, aka OpenVMS, for a long, long time. Before that it was RSTS/E (on 11/70s), which was definitely not secure, but it never pretended to be. Those were the days, eh?

Now I do not claim that I can keep VMS in my head either, but it was very well managed by the DEC team. Now who was the guy from DEC that went to Micro$oft and TRIED to implement some of VMS as WindoZe? I’m sure that Gates squashed his efforts like a bug. A shame.
BO
Bryan Olson
Oct 6, 2005
Lorem Ipsum wrote:
[…]
Now I do not claim that I can keep VMS in my head either, but it was very well managed by the DEC team. Now who was the guy from DEC that went to Micro$oft and TRIED to implement some of VMS as WindoZe?

Are you thinking of David Cutler?

http://en.wikipedia.org/wiki/Dave_Cutler

I’m sure that Gates
squashed his efforts like a bug. A shame.

You’d be wrong. On the other hand, some of us have been a bit disappointed that Richard Rashid’s ideas haven’t had more influence on Microsoft OSes.


–Bryan
R
Roberto
Oct 6, 2005
"Bryan Olson" wrote:
Are you are thinking of David Cutler?

http://en.wikipedia.org/wiki/Dave_Cutler

I’m sure that Gates
squashed his efforts like a bug. A shame.

You’d be wrong.

Thank you for the correction. I don’t wish David anything but success, for his sake, and for the rest of us.
T
toby
Oct 6, 2005
Mike Russell wrote:
Re the Unix "banter". I’ll have to agree with Chris Brown and others, and disagree with most of HR’s statemtents and attitude, which are, IMHO, uninformative and inappropriate, respectively.

The pseudonym itself is a tip-off.


Stallman is a legitimate, if controversial, figure, and deserves kudos for several reasons: surviving all this time with his philosophy vindicated, a large following with a body of work to point to, and a resulting impact on the commercial world that rivals any conventionally developed and marketed software package. And by that I mean everything except Windows 🙂

History will judge Stallman to have been an overwhelmingly positive (and more substantial) pioneering contributor to software, while the overwhelmingly destructive efforts of M$ will achieve their deserved obscurity. Thank you, Richard: in the 20 years since I first learned of your philosophy, my worsening experiences with proprietary software have only borne out what you’ve said all along.



RW
Roger Whitehead
Oct 6, 2005
In article <scR0f.1629$>, Mike Russell
wrote:
Re the Unix "banter". I’ll have to agree with Chris Brown and others, and disagree with most of HR’s statemtents and attitude, which are, IMHO, uninformative and inappropriate, respectively.

None of which is on-topic for a Photoshop group.



Roger
CB
Chris Brown
Oct 6, 2005
In article ,
Howard Roark wrote:
Chris Brown wrote:

In article ,
Howard Roark wrote:

Linux is *not* the de facto standard for any type of server. The various BSDs are just as popular and,

That’s not the impression I’ve got from spending the last decade working in the semiconductor industry. I never saw a BSD installation (apart from SunOS 4, which doesn’t count).

And who are you again?

I’m the guy who wrote the software that the guy who designed your cellphone used to make sure the damned thing was going to work, and it ran on Linux and Solaris. Pleased to make your acquaintance.

thankfully, they aren’t based on GNU.

What are they compiled with?

You don’t even know the basics of OSes which existed for decades before Linux came along,

You seem to know a lot about what I do and don’t know.
AH
Andrew Haley
Oct 7, 2005
In rec.photo.digital Lorem Ipsum wrote:
"Andrew Haley" wrote in message
In rec.photo.digital toby wrote:

That’s odd – because my folders of 35mm/120 negatives, which carry considerably more information than any JPEG I’ve seen, take up relatively little space.

Relative to what? Colour film is about 100 Kbytes/mm^3, whereas current hard disc drives are at about 1.5 Mbytes/mm^3.

So the thread degenerates to minutiae. Unplug your drive. Now measure.

OK. Still the same.

See how it works?

Not really, no. My point was directly relevant to the text that I quoted. Your response seems irrelevant.
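(For anyone wondering where a figure like 100 Kbytes/mm^3 for film might come from, here is one rough reconstruction – every number below is an assumption for illustration, not a measurement:

# One 36 x 24 mm frame of 35mm film on a ~0.13 mm base,
# credited with ~12 MB of recoverable image data (assumed):
volume_mm3 = 36 * 24 * 0.13                # ~112 mm^3
print(12_000 / volume_mm3, "KB per mm^3")  # ~107, i.e. roughly 100 Kbytes/mm^3

The hard disc figure follows from the same volume arithmetic applied to a drive’s case.)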

Andrew.
R
Roberto
Oct 14, 2005
Methinks the original spamming perp hasn’t seen JPEG2000 yet. Man, when is that EVER going to catch on?
BT
Bill Tuthill
Oct 14, 2005
In rec.photo.digital Lorem Ipsum wrote:
Methinks the original spamming perp hasn’t seen JPEG2000 yet. Man, when is that EVER going to catch on?

Perhaps never. Do you use it? I don’t.

Camera manufacturers do not support it in cameras, because RAW seems to be a better alternative for them.

For archiving, lossless JPEG 2000 is a bit smaller than PNG, but much less universal, and it is slow to encode/decode. Photoshoppers might prefer to archive PSDs with adjustment layers.

For "high" quality web images [sic], even if JPEG 2000 were widely supported in browsers, it doesn’t look much better than quality 85-95 (IJG scale) with 1×1 chroma subsampling, and doesn’t save much additional space.

For low quality web images, JPEG 2000 does produce smaller and better looking files. That seems like a very small win considering the hype and proprietary dead-end.

I would be happy to hear arguments to the contrary.
