- LOCK - Format War - HD DVD vs. Blu-Ray - LOCK -

[quote name='CocheseUGA']I swear to God he tries to start shit in every thread.[/QUOTE]
I'm starting shit in the "Format War - HD DVD vs. Blu-Ray" thread. Wow. Ho-ly fucking shit, man, that's really outta line.

I'm pissing in the piss ocean, dude.
 
He's obviously never read a HD review before either.
They all talk extensively about the lossless tracks in comparison to the DD.
 
[quote name='dallow']He's obviously never read a HD review before either.
They all talk extensively about the lossless tracks in comparison to the DD.[/QUOTE]

I wouldn't know. I ignored him after that WV torture thread debacle.

Pretty soon, he's not going to have anyone responding to him, and personally I can't wait.
 
[quote name='CocheseUGA']Pretty soon, he's not going to have anyone responding to him, and personally I can't wait.[/QUOTE]
Please. I'm not nearly well-known enough on these boards for such a thing.

Also butts.
 
[quote name='CocheseUGA']I wouldn't know. I ignored him after that WV torture thread debacle.

Pretty soon, he's not going to have anyone responding to him, and personally I can't wait.[/quote]I heard about that too.
And thanks for quoting me, he got to read what I wrote.

He's just a lot of talk, I have zero people on my ignore list because I'm not a :baby: .

(feel free to quote this as well)
 
[quote name='dallow']I heard about that too.
And thanks for quoting me, he got to read what I wrote.

He's just a lot of talk, I have zero people on my ignore list because I'm not a :baby: .

(feel free to quote this as well)[/QUOTE]

You should check out that thread and see Sub's screenshot. It's like 15 straight posts by people who were on his ignore list.

I ignore people who are either too ignorant for me to waste my time on, or complete douchebags. He can pick the category he wants to be in. I used to get extremely pissed off at people (like, come find where you're at and beat your ass mad), but this way I can let the little peckerwoods argue and not get involved.
 
[quote name='CocheseUGA']You should check out that thread and see Sub's screenshot. It's like 15 straight posts by people who were on his ignore list.

I ignore people who are either too ignorant for me to waste my time on, or complete douchebags. He can pick the category he wants to be in. I used to get extremely pissed off at people (like, come find where you're at and beat your ass mad), but this way I can let the little peckerwoods argue and not get involved.[/quote]Hehe, I see.
I'll check out that thread as well, I've been meaning to.



Time to steer this thread back on topic, though.

Go high definition media!
 
[quote name='dallow']Hehe, I see.
I'll check out that thread as well, I've been meaning to.



Time to steer this thread back on topic, though.

Go high definition media![/QUOTE]

Yup. I wish my 360 would come back soon so I can watch my HD DVDs. I could watch them on my computer, but I just don't care to.
 
I'm just waiting for things to become clearer about who's ahead. Paramount going HD-DVD really ****'ed things up for me. I was almost 100% set on going Blu-ray for my library of new blockbuster titles, but now that Paramount went HD-DVD I'm really bewildered.

I haven't bought the HD-DVD add-on for the 360 entirely because of it.

As far as 1080p versus 1080i, there is definitely a noticeable quality difference when you're on the right set, but everything nowadays is in the eye of the beholder. I think if you're not going for lossless audio, a standalone digital decoding receiver, high-end satellites, and a standalone subwoofer, you're not really at the point of a home theater where the quality matters.

A lot of my friends try to get into these debates about the audio quality of DTS versus DD EX and Blu-ray versus HD-DVD, yet they're using optical out on their units and HDMI only for the video signal, so really they're only getting the 1080p visual performance.

I'm happy with my Daewoo upconverting regular DVDs to 1080i/720p, and I'll just stick with a DVD library until the tide turns again and the battle is 60/40 or worse.
 
[quote name='rodeojones903']The HD DVD drive for my 360 in my bedroom won't open any longer. Booo![/QUOTE]

Call them and refuse to pay for the repairs.

If you make a big enough fuss, you'll only be out the cost of shipping, 4 hours on the phone and a month of movie watching.
 
[quote name='dallow']No shit. But you said they're ONLY GOOD for that.

ALL sound benefits from higher resolution.[/QUOTE]

I agree. I've been wanting K-Pop on SACD or DVD-A. I know some Se7en disc supposedly came out, but I can't find it in the SACD audio archives. It WAS listed on YesAsia, but now I can't find the listings.
Shame, since I'd rather hear good K-Pop than the garbage from the spoiled-brat mainstream American artists that usually get the 5-star audio treatment. Some of BoA's stuff is on SACD, but it's only the Japanese stuff, and I'd LOVE to get all her Korean stuff on SACD.
 
[quote name='dallow']It's a digital signal...
They do have boosters for the signal as well as it does have a cut off point.

But I've seen 50foot HDMI cables carry video and audio no problem.
I can't say the same for some analog cables.

At some point, you will lose the picture. Then you know you need a booster.

With analog, the picture just degrades gradually.[/quote] See, now I know you didn't read the article :p

Digital doesn't mean anything by itself. There's no error checking in it, and the twisted pair is what introduces interference/noise (which boosters only make worse, BTW). The only thing digital means is it doesn't have to use a DAC/ADC. But like the article says, the type of cable has muuuuuuuuuch less to do w/ it than what you're attaching it to.

Anyway, that's all beside the point, which was that all component for HD video uses 1080i, and the article explains why there's no difference between component and HDMI for HD (which implies 1080i and 1080p are identical for HD video), and that the thing that matters most is the quality of the player/TV. ;)
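To illustrate the cliff-vs.-gradual point everyone keeps dancing around, here's a toy Python sketch. The numbers are completely made up; it's not a model of real HDMI/TMDS behavior, just the shape of the argument:

[code]
# Toy illustration only: "digital cliff" vs. gradual analog degradation.
# The thresholds and slopes below are invented, not real cable specs.

def analog_quality(attenuation_db):
    # An analog picture gets gradually noisier as the signal weakens.
    return max(0.0, 1.0 - attenuation_db / 40.0)

def digital_quality(attenuation_db, cliff_db=25.0):
    # A digital link decodes cleanly until errors overwhelm it, then drops out.
    return 1.0 if attenuation_db < cliff_db else 0.0

for loss_db in range(0, 45, 5):
    print(f"{loss_db:>2} dB loss: analog {analog_quality(loss_db):.2f}, "
          f"digital {digital_quality(loss_db):.0f}")
[/code]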
 
Wow, a whole page of arguing over lossy vs. lossless. Of course lossless is going to be better, it's losing less.

I'm surprised nobody mentioned 16 vs. 24 bit. That's the real conundrum.

Lossy 24 bit vs. lossless 16 bit. There was a poll over at AVS on this and it was split right down the middle, so that makes me think it's highly dependent on how good the soundtrack is to begin w/.
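For anyone wondering what the extra 8 bits actually buy you on paper, this is just the standard PCM quantization math; it says nothing about whether a lossy encoder (or your gear) preserves it:

[code]
# Theoretical dynamic range of linear PCM: about 6.02 * bits + 1.76 dB.
def pcm_dynamic_range_db(bits):
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit PCM: ~{pcm_dynamic_range_db(bits):.0f} dB of dynamic range")
# 16-bit comes out around 98 dB, 24-bit around 146 dB. Whether that extra
# headroom survives a lossy encode (or is audible in a living room) is
# exactly what the AVS poll was split over.
[/code]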
 
[quote name='H.Cornerstone']After going from a 1080i to a 1080p TV, I can definitely say that 1080p>>>>>1080i. And HDMI cables are really cheap and, if you have a PS3 or 360, are cheaper than component, and can get you better picture quality. So, if you have a 1080p TV, HDMI is definitely the way to go.

And geko, I usually remember things you say because I am so shocked to see an HD-DVD fanboy with logical reasons for liking the format and facts to back it up. :)[/quote]
That's because it's the TV, not the cable. Like I explained to you waaayy back :p
The only 1080i TVs that exist are tubes, and they have 1/2 the horizontal resolution of 1080p. But 1080i and 1080p (the signals) have the same resolution. I know, it's a little confusing. But a 1080i signal on a 1080p TV is displayed as 1080p just like a 1080p signal is (since it's the same signal, just sent shuffled). Make sense?
 
[quote name='propeller_head']Lossy 24 bit vs. lossless 16 bit. There was a poll over at AVS on this and it was split right down the middle, so that makes me think it's highly dependent on how good the soundtrack is to begin w/.[/quote]
The only votes that matter to me are those of the (two, I believe) sound engineers whose entire job is to do A/B comparisons to make sure their soundtracks sound as good as they possibly can. They say a high-bitrate lossy 24-bit (DD+, DTS-HD) soundtrack is noticeably better than a 16-bit lossless (TrueHD, DTS-HD MA, PCM) one.

Since I unfortunately haven't had the opportunity to make such comparisons myself, I'll take the experts at their word.
 
[quote name='rodeojones903']The HD DVD drive for my 360 in my bedroom won't open any longer. Booo![/QUOTE]

1. Go to Best Buy and buy 360 HD DVD player
2. Return broken HD DVD player in the box
3. Tell them "I didn't know you needed a 360 to use it"

Bonus Points for having your hot GF/Wife doing it as they won't question it much. Make sure she is showing some cleavage.
Best Buy does NOT track serial #'s for the 360 HD DVD player.

:cool:
 
[quote name='geko29']The only votes that matter to me are those of the (two, I believe) sound engineers whose entire job is to do A/B comparisons to make sure their soundtracks sound as good as they possibly can. They say a high-bitrate lossy 24-bit (DD+, DTS-HD) soundtrack is noticeably better than a 16-bit lossless (TrueHD, DTS-HD MA, PCM) one.

Since I unfortunately haven't had the opportunity to make such comparisons myself, I'll take the experts at their word.[/quote]I'd love to read about this if you have links.
Are these the sound engineers who create the DD tracks from the master?
Surely they're proud of their work, but to say it's superior?

You have the equipment, I'm surprised you haven't done some comparisons on your own.

I haven't compared DD+ vs TrueHD/PCM (just DD vs LPCM) and the choice is clear in my tests.
And to AV reviewers as well.
 
[quote name='propeller_head']That's because it's the TV, not the cable. Like I explained to you waaayy back :p
The only 1080i TVs that exist are tubes, and they have 1/2 the horizontal resolution of 1080p. But 1080i and 1080p (the signals) have the same resolution. I know, it's a little confusing. But a 1080i signal on a 1080p TV is displayed as 1080p just like a 1080p signal is (since it's the same signal, just sent shuffled). Make sense?[/quote]

Unless you have a TV with a crappy deinterlacer, in which case 1080p looks a lot better than 1080i. And of course 1080i and 1080p are the same resolution, but 1080p still LOOKS better.
 
[quote name='H.Cornerstone']Unless you have a TV with a crappy deinterlacer, in which case 1080p looks a lot better than 1080i. And of course 1080i and 1080p are the same resolution, but 1080p still LOOKS better.[/quote]Uh, no it doesn't, unless you have a TV w/ no deinterlacer. It's not exactly a complicated process; it's not like upscaling. All it does is rearrange/weave the scan lines. So instead of getting a,b,c,d,e,f,g,h as in 1080p, it gets a,c,e,g,b,d,f,h and just rearranges it again to a,b,c,d,e,f,g,h, since all LCD, LCoS, and DLP sets are synchronous refreshes. 1080i was designed to make it simpler for tube TVs back when they would actually refresh in an interlaced manner (& the electronics were much slower). But as far as virtually all HDTVs now, 1080i and 1080p are one and the same; there's no degradation. The only degradation comes when you use a 1080i display (and that degradation would still be there even if you used a 1080p signal). Even the earliest 1080p TVs didn't accept 1080p signals, they accepted 1080i. But they were still displaying at 1080p exactly the same as the 1080p sets which accept both now. A large reason players are outputting 1080p now too is that consumers are naive. Many would think a 1080i signal doesn't work on their new 1080p HDTV, and some of the idiots who work at CC or BB might just try to tell them that (to sell higher-priced items).

If a deinterlacer can weave the fields faster than it can receive them, there's no temporal interference. And I don't think you're going to find a deinterlacer in any name-brand HDTV which can't do that w/ ease.

So, if a waiter gives your friend two 8oz glasses of water and you one 16oz glass of water, does either of you have more water? If he drinks both and you drink yours, is your stomach absorbing it better because you drank it out of one glass?

I can't make it any clearer than that.

Again: 1080i display, yes, slightly worse.
1080p source material sent as a 1080i signal to a 1080p display = 1080p source material sent as a 1080p signal to a 1080p display
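If the shuffling bit isn't clicking, here's the whole round trip in a few lines of Python. It's a toy with eight "scan lines" standing in for 1080, not real video processing:

[code]
# Progressive frame -> two interlaced fields -> weave back together.
# For progressive (film/HD-media) source, nothing is lost along the way.

frame = [f"line{n}" for n in range(1, 9)]   # stand-in for a 1080p frame

odd_field  = frame[0::2]   # lines 1,3,5,7 (sent first)
even_field = frame[1::2]   # lines 2,4,6,8 (sent second)

rebuilt = [None] * len(frame)
rebuilt[0::2] = odd_field   # the TV's deinterlacer just weaves them back
rebuilt[1::2] = even_field

assert rebuilt == frame     # same lines, same order: no resolution lost
print(rebuilt)
[/code]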
 
[quote name='dallow']I'd love to read about this if you have links.
Are these the sound engineers who create the DD tracks from the master?
Surely they're proud of their work, but to say it's superior?[/quote] Yes, they're the people who create the printmasters of the soundtracks, then test the compressed versions against their masters and make adjustments if necessary (they don't actually do the compression). I only remember the screenname of one of the gentlemen, so I can only search for his posts. Here are a few:

High bitrate lossy preferable to lossless with truncation
Lots of info about the mixing process
More than you could possibly want to know about the mixing process
Why films are mixed at 48kHz instead of 96kHz
24-bit lossy more transparent to the original master than 16-bit lossless

You can find more, just search on user "Filmmixer" and "lossless" or whatever you're interested in.


[quote name='dallow'] You have the equipment, I'm surprised you haven't done some comparisons on your own.[/quote] I have the equipment; I don't have the software. None of my titles have both 24-bit DD+ and 16-bit TrueHD. For the same bit depth, I definitely prefer the TrueHD, but I haven't been able to make that specific comparison, which was the subject of my post.

[quote name='dallow'] I haven't compared DD+ vs TrueHD/PCM (just DD vs LPCM) and the choice is clear in my tests.
And to AV reviewers as well.[/quote] Oh, I agree, TrueHD/PCM beats the snot out of DD. Absolutely no question about it. But 1.5Mbps DD+ does too. DD at 384kbps (DVD), 448kbps (theater), or 640kbps (HD DVD/Blu-Ray) is just "too lossy".
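Rough numbers, for anyone who wants to see how far apart these really are. The LPCM figures are just the bits x sample rate x channels math; the lossy figures are the track bitrates quoted above, so treat them all as ballpark:

[code]
# Ballpark audio bitrates in kbps (5.1 counted as 6 full-rate channels).
def lpcm_kbps(channels, bits, sample_rate_hz):
    return channels * bits * sample_rate_hz / 1000

print(f"5.1 LPCM 16/48: {lpcm_kbps(6, 16, 48_000):,.0f} kbps")   # ~4,600
print(f"5.1 LPCM 24/48: {lpcm_kbps(6, 24, 48_000):,.0f} kbps")   # ~6,900
print("DD (DVD):        384-448 kbps")
print("DD (HDM):        640 kbps")
print("DD+ (HD DVD):    ~1,500 kbps")
[/code]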
 
[quote name='propeller_head']Uh, no it doesn't, unless you have a TV w/ no deinterlacer. It's not exactly a complicated process; it's not like upscaling. All it does is rearrange/weave the scan lines. So instead of getting a,b,c,d,e,f,g,h as in 1080p, it gets a,c,e,g,b,d,f,h and just rearranges it again to a,b,c,d,e,f,g,h, since all LCD, LCoS, and DLP sets are synchronous refreshes. 1080i was designed to make it simpler for tube TVs back when they would actually refresh in an interlaced manner (& the electronics were much slower). But as far as virtually all HDTVs now, 1080i and 1080p are one and the same; there's no degradation. The only degradation comes when you use a 1080i display (and that degradation would still be there even if you used a 1080p signal). Even the earliest 1080p TVs didn't accept 1080p signals, they accepted 1080i. But they were still displaying at 1080p exactly the same as the 1080p sets which accept both now. A large reason players are outputting 1080p now too is that consumers are naive. Many would think a 1080i signal doesn't work on their new 1080p HDTV, and some of the idiots who work at CC or BB might just try to tell them that (to sell higher-priced items).

If a deinterlacer can weave the fields faster than it can receive them, there's no temporal interference. And I don't think you're going to find a deinterlacer in any name-brand HDTV which can't do that w/ ease.

So, if a waiter gives your friend two 8oz glasses of water and you one 16oz glass of water, does either of you have more water? If he drinks both and you drink yours, is your stomach absorbing it better because you drank it out of one glass?

I can't make it any clearer than that.

Again: 1080i display, yes, slightly worse.
1080p source material sent as a 1080i signal to a 1080p display = 1080p source material sent as a 1080p signal to a 1080p display[/quote] ALL 1080p sets have deinterlacers. That's how interlaced sources get displayed on a progressive screen. It's not possible otherwise. And the reason the quality of that deinterlacer matters is because you have to put the RIGHT fields together. With a film-based (24p) source, this is especially difficult because the cadence is all screwy, and if it's done improperly, you get interlacing artifacts (where a frame is improperly built from fields belonging to two different frames), sometimes called stair-stepping or jaggies. It's HORRENDOUS on horizontal pans if the deinterlacing isn't happening properly.

So you need a decent deinterlacer that can recognize the goofy-ass 3:2 cadence that film-originated material shows up in, and properly reconstruct the original frames from the fields that are coming in. It's not an easy task, and many (especially low-end) 1080p sets aren't up to it. My 37" Westinghouse, for example, has a god-awful deinterlacer (I believe it's a low-end Genesis). 480i TV looks horrendous, as does 1080i HDTV and HD DVD sent as 1080i. You MUST give my set a 1080p signal, or the result is far beyond ugly. Luckily it's my PC and 360 monitor, so I'm all good from that aspect. :)

On the other hand, some sets have quality deinterlacers such as those from ABT (Anchor Bay Technology) or Silicon Optix (the Reon or Realta), and they perform EXCEPTIONALLY well with interlaced inputs, such that there is absolutely no difference between a 1080i input and a 1080p input. But those sets are still relatively rare and generally more expensive. While videophiles were waiting for such sets to come on the market, the only alternatives were standalone video processors, which run anywhere from $2500 to $10k. That much of the technology in those processors is now starting to show up in high-end displays, receivers (onkyo 875/905), and DVD players (Denon 2930, Toshiba HD-XA2/HD-A35, Samsung BD-P1200/2400) is really quite remarkable.

If a set didn't have a deinterlacer, or more accurately, had a deinterlacer incapable of operating on 1080i inputs, then you have an even worse problem. The first-gen Sony SXRDs are a perfect example. They take a 1080i field, re-flag it as a 540p frame (called "bobbing"), then interpolate the missing lines to create a 1080p picture. You instantly lose half the resolution.
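For anyone trying to follow the cadence argument, here's roughly what 3:2 pulldown does to 24fps film on its way to 60 fields per second. A bare-bones sketch; real IVTC also has to cope with cadence breaks at edits, which is where the cheap chips tend to fall over:

[code]
# 3:2 pulldown sketch: 4 film frames become 10 fields (2/5 of a second at 60i).
film_frames = ["A", "B", "C", "D"]
pulldown = [3, 2, 3, 2]          # fields emitted per film frame

fields = []
for frame, count in zip(film_frames, pulldown):
    for _ in range(count):
        parity = "t" if len(fields) % 2 == 0 else "b"   # top/bottom field
        fields.append(frame + parity)

print(fields)
# ['At', 'Ab', 'At', 'Bb', 'Bt', 'Cb', 'Ct', 'Cb', 'Dt', 'Db']
# A good deinterlacer detects the repeats and weaves fields from the SAME film
# frame back together; a bad one pairs fields from two different frames, which
# is where the jaggies/stair-stepping on pans come from.
[/code]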
 
WRONG, not ALL 1080p sets have deinterlacers. 99.999% of ALL new HDTVs do, but there are 1080p sets which are just glorified monitors (like some old Westinghouses, which, you're right, after 2005 used Genesis chips [btw, HT Mag has the LVM-37W1 as passing the SMPTE 133 deinterlace test, so I don't know what's wrong w/ yours]). And any new HDTV which is labeled as 1080i compatible can do it well. JVC, Hitachi, Pioneer & Toshiba have been deinterlacing perfectly since 2002; it's 2007 now. If this were 2001 your post would be accurate. Early HDTVs weren't as good at this, BUT as I said, HDTVs you can find now do it perfectly, better than a $6,000 old Faroudja. Now, the reason is that some of these early TVs DIDN'T REALLY deinterlace, they just took the 540 lines of resolution and upconverted them one field at a time; THAT IS NOT PROPER DEINTERLACING & any signal processor which does this does not deserve to be called a deinterlacer BECAUSE IT ISN'T ONE. If it's done properly (as almost all new HDTVs labeled for 1080i do) there is NO difference AT ALL.

Like I said, as long as it can deinterlace faster than it can receive, there will be no temporal interference.

You're confusing 1080i SOURCE material being deinterlaced (like, say, NBC) with 1080p SOURCE material that is being sent as 1080i. With 1080i SOURCE material the deinterlacer has to compensate for motion; all HD formats are 1080p source, ONLY the signal is 1080i, so no motion has to be compensated for.

Edit: oh great, you edited it right after I started writing a reply lol
 
[quote name='propeller_head']
So, if a waiter gives your friend two 8oz glasses of water and you one 16oz glass of water, does either of you have more water? If he drinks both and you drink yours, is your stomach absorbing it better because you drank it out of one glass?

I can't make it any clearer than that.

Again: 1080i display, yes, slightly worse.
1080p source material sent as a 1080i signal to a 1080p display = 1080p source material sent as a 1080p signal to a 1080p display[/QUOTE]

I've never heard anyone explain it so clearly. Thanks, I really mean that. I kept thinking my Hitachi was crap because it only went to 1080i.
 
[quote name='Sporadic']And BDA goes into spin mode once again, this time against the new 51GB HD-DVD.



http://www.tech.co.uk/home-entertainment/high-definition/news/blu-ray-camp-responds-to-51gb-hd-dvd-claims?articleid=722237102

Yeah, 15 times 3 is 45 not 51 :cry: How is that possible?!? HOW DO YOU MAKE 45 INTO 51??? VOODOO? It's completely impossible that they could have come up with a way to fit more data on to the same layer. Also I have heard nothing about a production line for this odd format even though it was just approved by the DVD Forum less than 2 weeks ago. Plus there is no way that this odd disc will ever work on current players (please ignore the giant elephant in our room)

Looks like this was another surprise that the BDA had no idea about even though, unlike the Paramount deal, there have been rumblings about it for months.[/quote]
When the head of BDA says something as stupid as "Gamers don't buy movies," then you can come talk to me about "spin doctoring." And NO ONE knew about the Paramount deal, so I don't blame the BDA for that.

And Propeller, I have seen 1080p source material sent as 1080i (Hot Fuzz HD-DVD) and 1080p sent as 1080p (Casino Royale) to a 1080p TV, and for some reason Casino looked better.
 
[quote name='propeller_head']WRONG, not ALL 1080p sets have deinterlacers. 99.999% of ALL new HDTVs do, but there are 1080p sets which are just glorified monitors (like some old Westinghouses, which, you're right, after 2005 used Genesis chips [btw, HT Mag has the LVM-37W1 as passing the SMPTE 133 deinterlace test, so I don't know what's wrong w/ yours]).[/quote] Because I have the LVM-37W3, which switched down from a midlevel Genesis chip to the low end one. I knew the deinterlacing performance was going to be piss poor before I bought it, because I read the 300+ (at the time) page thread at AVS of people doing HQV and other tests on it. But I didn't care, because I wasn't buying it as a TV. It's my PC monitor 95% of the time, and my Xbox runs it the other 5%. So I never have to worry about handing it an interlaced input.

It HAS a deinterlacer. And the deinterlacing sucks. This is the case for many less-expensive TVs. And thanks for bringing up HT magazine. From their 1080p RPTV shootout:
[quote]Every TV correctly deinterlaced 1080i/30, but only two were able to handle the 3:2 sequence from 1080i.
Like only one other TV in the Face Off (the JVC), the 565H correctly deinterlaces 1080i/30 and correctly detects the 3:2 sequence in 1080i material. This is no doubt due to the Pixelworks DNX chip.
The Mitsubishi's 3:2 pickup with 480i was about average, and, like most of the other TVs, the Mitsubishi didn't pick it up with 1080i.
The Sony's processing was pretty middle-of-the-road. It picked up the 3:2 sequence with the Gladiator clip but not with the Silicon Optix discs (neither 480i nor 1080i). The video processing was OK; the waving-flag scene from the same discs had only slightly jagged edges.
Processing was fairly average. The 62MX196 picked up the 3:2 sequence with 480i on both the Gladiator clip and the Silicon Optix disc. The waving flag from Silicon Optix had some jagged edges, but it wasn't too bad. While the 62MX196 deinterlaced 1080i/30 correctly, it wasn't able to pick up the 3:2 sequence (like most of the displays here).
The processing hasn't worsened from last year's model, either. Like only one other TV in the Face Off (the Olevia), the JVC was able to detect and process the 3:2 sequence in a 1080i signal. Few TVs on the market do this.[/quote]
I believe that's EXACTLY what I said. Some deinterlacers suck, and don't do well with film (24p) originated source material transmitted as 1080i/60.

If they don't have a deinterlacer, how did they deinterlace 1080i/30? If they do have a deinterlacer and, as you claim, the quality of the deinterlacer doesn't matter, why do they screw up on film-based sources? Obviously the quality of the deinterlacing chip in your display matters if you're handing it a 1080i signal, ESPECIALLY if it comes from a film-based, progressive source.

Or maybe they're wrong too...

This is why H.Cornerstone had the experience he did. The lower-end Samsung models (one of which his friend has) do not have very good deinterlacers, and they can't handle the IVTC procedure properly. Many TVs fall into this category, which is why you see people all over AVS complaining about IVTC artifacts when they hook a 1080i player up to a 1080p set. They're not imagining it.

BTW, don't worry about the editing, I just added the part about the SXRDs bobbing 1080i inputs instead of deinterlacing them.
 
[quote name='H.Cornerstone']When the head of BDA says something as stupid as "Gamers don't buy movies," then you can come talk to me about "spin doctoring." And NO ONE knew about the Paramount deal, so I don't blame the BDA for that.

And Propeller, I have seen 1080p source material sent as 1080i (Hot Fuzz HD-DVD) and 1080p sent as 1080p (Casino Royale) to a 1080p TV, and for some reason Casino looked better.[/QUOTE]

Those are different movies. Of course they're going to look different. If you take Casino Royale and put it up against House of Flying Daggers, Casino Royale will look better, but they're still both 1080p. You should try some Warner title that is available on both formats and has the same transfer. Happy Feet, 300, TMNT, etc.
 
[quote name='geko29']Because I have the LVM-37W3, which switched down from a midlevel Genesis chip to the low end one. I knew the deinterlacing performance was going to be piss poor before I bought it, because I read the 300+ (at the time) page thread at AVS of people doing HQV and other tests on it. But I didn't care, because I wasn't buying it as a TV. It's my PC monitor 95% of the time, and my Xbox runs it the other 5%. So I never have to worry about handing it an interlaced input.

It HAS a deinterlacer. And the deinterlacing sucks. This is the case for many less-expensive TVs. And thanks for bringing up HT magazine. From their 1080p RPTV shootout:
I believe that's EXACTLY what I said. Some deinterlacers suck, and don't do well with film (24p) originated source material transmitted as 1080i/60.

If they don't have a deinterlacer, how did they deinterlace 1080i/30? If they do have a deinterlacer and, as you claim, the quality of the deinterlacer doesn't matter, why do they screw up on film-based sources? Obviously the quality of the deinterlacing chip in your display matters if you're handing it a 1080i signal, ESPECIALLY if it comes from a film-based, progressive source.

Or maybe they're wrong too...

This is why H.Cornerstone had the experience he did. The lower-end Samsung models (one of which his friend has) do not have very good deinterlacers, and they can't handle the IVTC procedure properly. Many TVs fall into this category, which is why you see people all over AVS complaining about IVTC artifacts when they hook a 1080i player up to a 1080p set. They're not imagining it.

BTW, don't worry about the editing, I just added the part about the SXRDs bobbing 1080i inputs instead of deinterlacing them.[/quote] Um, you realize that the players are outputting 1080i/30, and that the quote you used said that ALL of the TVs were able to deinterlace that fine. Again, like I already said, you're confusing 1080i SOURCE material w/ 1080p source material sent as a 1080i SIGNAL.

When deinterlacing 1080i SOURCE material you have to compensate for motion; this is where the problems you're referring to come in. But HD media is not 1080i source (well, save for some Discovery BDs I think, but even then, if it's sent as 1080p the player has to do the same thing & can/will run into the same problems). Most HD TV shows that are released on HD media are mastered as 1080p source instead, though.
 
[quote name='propeller_head']Um, you realize that the players are outputting 1080i/30, and that the quote you used said that ALL of the TVs were able to deinterlace that fine. Again, like I already said, you're confusing 1080i SOURCE material w/ 1080p source material sent as a 1080i SIGNAL.[/quote] NO I'm NOT. It SPECIFICALLY says they have problems with 3:2 cadence detection. When do you use 3:2 cadence detection? When you have a film-based, progressive source! Why? Because 24=/=30.

When your source is 30fps, it's pretty easy to reconstruct. When it's 24fps, not so much. What did you THINK they were talking about?

Since you're still confused, here's what the HQV Benchmark flag-waving test they're talking about (the one that produced jaggies on most sets) checks for:
[quote]As the flag waves, the stripes will show any jaggies that occur from incorrect deinterlacing.[/quote]
Here's an example from one of their other deinterlacing tests. The source is 24fps progressive film from HD DVD or Blu-Ray:

[image: img_compare_cadenc.jpg (cadence detection comparison)]

See the difference? Again, this is with a 24fps FILM source, sent to the TV as 1080i/30, where the deinterlace/IVTC is NOT being done correctly. THIS is what they, and I, are talking about. Their instructions even tell you to make sure your HD DVD/Blu-Ray player is set for 1080i, otherwise you CANNOT test the set:
[quote name='HQV Benchmark Instruction Manual']When evaluating the HDTV make sure to put the HD DVD player into 1080i output mode.[/quote]
 
[quote name='geko29']NO I'm NOT. It SPECIFICALLY says they have problems with 3:2 cadence detection. When do you use 3:2 cadence detection? When you have a film-based, progressive source! Why? Because 24=/=30.

When your source is 30fps, it's pretty easy to reconstruct. When it's 24fps, not so much. What did you THINK they were talking about?

Since you're still confused, here's what the HQV Benchmark flag-waving test they're talking about (the one that produced jaggies on most sets) checks for:
Here's an example from one of their other deinterlacing tests. The source is 24fps progressive film from HD DVD or Blu-Ray:

[image: img_compare_cadenc.jpg (cadence detection comparison)]

See the difference? Again, this is with a 24fps FILM source, sent to the TV as 1080i/30, where the deinterlace/IVTC is NOT being done correctly. THIS is what they, and I, are talking about. Their instructions even tell you to make sure your HD DVD/Blu-Ray player is set for 1080i, otherwise you CANNOT test the set:[/quote] Yes, but what you're not realizing is the player does this. The only time the TV does this is w/, again, 1080i source (like OTA NBC). What HD DVDs or BDs do you know of that run 30fps? That's the output of the players, 1080i/30. That's what the TV takes and uses for the pulldown (the same 1080i/30 which you previously quoted HT Mag as saying ALL the TVs deinterlaced correctly). But the pulldown is already done, it just has to deinterlace it again. And again, the incorrect deinterlacing is from 1080i source material, not a 1080i signal. Using a company that wants to sell you their deinterlacers as proof is not very convincing. The HQV test pic you're using is using 1080i source material because it's designed to test the TV's ability to deinterlace 1080i source material. That is not the same thing as HD media, which again is 1080p source. The cadence problems you keep referring to have to do w/ the fields not being synchronous (aka 1080i source).
 
[quote name='propeller_head']Yes, but what you're not realizing is the player does this. The only time the TV does this is w/, again, 1080i source (like OTA NBC). What HD DVDs or BDs do you know of that run 30fps? That's the output of the players, 1080i/30. That's what the TV takes and uses for the pulldown (the same 1080i/30 which you previously quoted HT Mag as saying ALL the TVs deinterlaced correctly). But the pulldown is already done, it just has to deinterlace it again. And again, the incorrect deinterlacing is from 1080i source material, not a 1080i signal. Using a company that wants to sell you their deinterlacers as proof is not very convincing. The HQV test pic you're using is using 1080i source material because it's designed to test the TV's ability to deinterlace 1080i source material. That is not the same thing as HD media, which again is 1080p source. The cadence problems you keep referring to have to do w/ the fields not being synchronous (aka 1080i source).[/quote]

I'm done. I can tell the difference, H.Cornerstone can tell the difference, thousands of professional reviewers can tell the difference. You can go ahead and believe whatever you want.
 
[quote name='geko29']I'm done. I can tell the difference, H.Cornerstone can tell the difference, thousands of professional reviewers can tell the difference. You can go ahead and believe whatever you want.[/quote]Hehe, that's how I felt with CoffeeEdge.
 
Does anyone remember the good old days when all TVs were the same resolution, there were only a couple kinds of connectors, a single prominent video format, and that was that? Simpler, happier times!
 
[quote name='geko29']I'm done. I can tell the difference, H.Cornerstone can tell the difference, thousands of professional reviewers can tell the difference. You can go ahead and believe whatever you want.[/quote] Well, I just don't see how, from a technical perspective. How does outputting 1080p source material w/ a synchronous 1080i signal introduce any artifacts? Where are they introduced from? It's certainly not from motion compensation or pulldown. Like I said before, the only way I can see those problems arising is if the set doesn't actually deinterlace it & uses that upscaling kludge where it doubles each 540-line field to 1080 and attempts to use the higher refresh rate to trick the eyes, similar to the old CRT trick. But that only existed on old cheap HDTVs (and isn't real deinterlacing), not anything you get today. Like the 1080p roundup you quoted mentioned, all the TVs deinterlaced 1080i/30 fine. (TV manufacturers know that players output that, which is why.)

I'd love for you to be right, 'cause then that's something new I would have learned. But I really think you're wrong & confusing things. Like when you were talking about stair-stepping: that has to do w/ the fields being out of sync (an inherent trait of 1080i source material), but it doesn't apply to 1080p source material. It's not that it's being deinterlaced wrong so much as that the signal processor which includes the deinterlacer sucks at interpolating the field disparity of 1080i source material.

& unless I'm remembering wrong, H.Corner was talking about seeing a diff vs. his old 1080i TV, which is a whole 'nother story. 1080i source, 1080i signal, and 1080i display are all different things.
 
[quote name='propeller_head']& unless I'm remembering wrong, H.Corner was talking about seeing a diff vs. his old 1080i TV, which is a whole 'nother story. 1080i source, 1080i signal, and 1080i display are all different things.[/quote] I could be remembering incorrectly too, I'll submit that much. But my memory of his test was Hot Fuzz via 1080i component from a 360 addon and Casino Royale via 1080p HDMI from a PS3 to a Samsung 1080p LCD set with a substandard deinterlacer (according to user reviews on AVS). In his experience (again, if my memory serves), Casino Royale (well-reviewed but not in the top 10 best-looking Blu-Rays) looked substantially better on the same 2007-model set than Hot Fuzz (best-looking HDM title regardless of format). I attributed that to the poor deinterlacing performance.

This mirrors my own experience playing with various gear combinations at BB/Magnolia roughly a year ago. I found some sets that looked SPECTACULAR with an A1 or A2 source (incapable of outputting 1080p), and some that looked shitty. The vast majority of sets looked as good as those few when they were fed via an A20, a BD-P1200, or a BDP-S1 at 1080p, eliminating any processing in the display. Based on that, my opinion was born that 1080p always looks good regardless of the 1080p set, vs. 1080i looking just as good on only a select few sets.

I do realize and agree with you that a 1080i signal from a HD/BD player contains all the information necessary to create a 1080p picture, and a well-behaved set will make that profoundly obvious. Where I think we differ is that my experience says some sets can turn that 1080i into 1080p with absolutely zero loss, while many (including my Westy) cannot; and yours says that 1080i always looks the same as 1080p, regardless of the display device.

I think that is what we've been fighting about. Feel free to correct me if you think I'm wrong on that.
 
[quote name='CoffeeEdge']Does anyone remember the good old days when all TVs were the same resolution, there were only a couple kinds of connectors, a single prominent video format, and that was that? Simpler, happier times![/quote]

With shitty video that dreamed of being able to resolve a full 480 lines. No thank you, I'll take our current clusterfuck that sometimes gives us a full HD picture vs. that hellhole. :)
 
[quote name='geko29']I could be remembering incorrectly too, I'll submit that much. But my memory of his test was Hot Fuzz via 1080i component from a 360 addon and Casino Royale via 1080p HDMI from a PS3 to a Samsung 1080p LCD set with a substandard deinterlacer (according to user reviews on AVS). In his experience (again, if my memory serves), Casino Royale (well-reviewed but not in the top 10 best-looking Blu-Rays) looked substantially better on the same 2007-model set than Hot Fuzz (best-looking HDM title regardless of format). I attributed that to the poor deinterlacing performance.

[/quote]
I was talking about both. I used to have a CRT HDTV, and now have an LCD HDTV that does 1080p, and the difference is astounding. And seeing how CRTs have a much, much higher contrast ratio and can do true black, the picture quality should be better, so I don't see how tube vs. LCD makes a difference. But anywho, I have also seen Hot Fuzz via component on an HD-DVD 360 add-on and Casino Royale via HDMI on a Samsung 4061 TV, and Casino Royale looked better, and I think Hot Fuzz is more highly regarded PQ-wise.

And Geko, that picture looks exactly like what I saw; sometimes I would see jaggies in the picture. Now, they weren't THAT bad, but they were there, although uncommon. Also, is Hot Fuzz more highly regarded than Pirates of the Caribbean: DMC? Because after seeing that movie, I don't know how anything could be better. :)
 
Yes, the general consensus (of course there are detractors, but the vast majority agree) is that Hot Fuzz is the best-looking title regardless of format. POTC is a close second (both 1 & 2 being roughly equal PQ-wise), with King Kong behind them. No matter how you shake it, HF, POTC, and KK are head and shoulders above CR, PQ-wise. So the fact that CR looked better is quite telling, IMO.

PS. I am FAR beyond wasted (9 Appleton and cokes plus a few glasses of wine and two scotches--1x12 year, 1x16 year--at a friend's wedding), so apologies here for misspellings or general fuckups. The basic ideas are correct, I think. I'll tell you in the morning. :)
 
Pretty good interview with Denon concerning Profile 1.0 players not being able to do 1.1, BDA issues, and Denon seriously considering an HD DVD player (or a Combo)

http://www.listenup.com/content/partner_stores/denon/talmadge.aug.07.php

"But there is a possibility — and this is maybe not so public knowledge — that when these discs come out that fit this new profile, they may not work properly with the Profile 1.0 players."

LU: And a really good DVD player. And Denon is still agnostic, basically, about the high-def video formats, isn't it?

JT: We don't care. In the Blu-ray.com conversation from my last interview with you they were saying, "Well, Denon's a member of the Blu-ray Association." Yes, we are. That's public knowledge. That just means we're a member. That doesn't mean we're a deciding member. We don't have a license, we don't hold licenses on any technology in Blu-ray. There are like four or five gods at Blu-ray up there. But we're on Toshiba's doorstep every day talking to them, too. So, you know, we, as we've always tried to be with new technologies or new partners, to be as agnostic as possible, work with everybody, and give the customer the option--let them decide. That's really what it comes down to. You know, the Blu-ray sort of stems from that a little bit, but it gets us in the market. It shows that we are working with our dealers, that we are striving to be a part of the next high-definition player, whatever it may be, and it just seemed to work out a little bit easier, so that's where we are right now.

LU: So you would consider doing a combo player with Blu-ray and HD DVD?

JT: That's probably our number one consideration at this point in time. They're just starting to come out from some of the major players — obviously, Samsung has introduced theirs. They'll be out this fall, while the LG player, you know, it's not a true HD DVD player — it's not a licensed HD DVD product. LG has to make their own logo, it doesn't display menus for movies, and all this other stuff. So Samsung's going to be the first true combo player. And hats off to them.

As I, and others have mentioned, this will be a big issue with people who own current Blu-ray stand alone players. I have yet to hear OFFICIAL news that the PS3 will be able to handle the 1.1 profile (though it does have network capabilities built-in), but I am pretty sure the PS3 can handle 1.1 and 2.0.

Hopefully 1.1 discs will work on 1.0 players and it's just the special features that will have issues, not the movies themselves. If not, this will cause extreme chaos.
 
[quote name='H.Cornerstone']When the head of BDA says something as stupid as "Gamers don't buy movies," then you can come talk to me about "spin doctoring."[/QUOTE]

It's true. If they actually did buy movies, the attach rate for Blu-Ray wouldn't be an abysmal 0.07 while HD-DVD is at 4 (since everybody buys the player or add-on for movies, not as some weird justification for paying out the ass for something).
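For reference, the attach rate here is just discs sold divided by players able to play them. Illustrative numbers only, picked to land on the figures above, not actual sales data:

[code]
# Attach rate = movies sold / players in the field. Placeholder numbers only.
def attach_rate(discs_sold, players_sold):
    return discs_sold / players_sold

print(attach_rate(400_000, 100_000))    # 4.0  -- dedicated-player profile
print(attach_rate(70_000, 1_000_000))   # 0.07 -- console-heavy install base
[/code]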
 
[quote name='mykevermin']What specific player makes up 50% of HD-DVD players out there? Here's a hint: it's not the HD-A2.[/QUOTE]

And why do people buy the add-on?

Is it so they can play the latest/greatest game? As a status symbol? As a futureproofing idea to justify the price for when they finally get an HDTV?

No.

They buy it specifically to play movies. The same thing can't be said for the PS3, and the attach rates prove it. You can argue all you want, but the proof is in the pudding.
 
Why oh why can't they release Batman Begins on Blu-Ray? I'll probably pick up an HD-DVD add-on eventually, because I have to have this movie in hi-def.
 