10-19-2014, 01:20 PM   #61
Veteran Member
dakight's Avatar

Join Date: Sep 2014
Location: Phoenix, AZ
Photos: Albums
Posts: 1,216
FWIW, Netflix has announced a premium plan for 4K streaming and they are adding content. It may not be totally mainstream now, but it won't be long until it is. Personally, I don't have anything to display it on.


Multiple CPUs would be truly cool, and wait until they add Bluetooth and give you the option to connect a keyboard, mouse and external display. Personally I think Bluetooth would be more useful than WiFi.

10-19-2014, 01:28 PM   #62
Senior Member
Paul MaudDib's Avatar

Join Date: Jul 2009
Location: Michigan
Posts: 294
QuoteOriginally posted by normhead Quote
In my definition, mainstream is mainstream as in, everyone has one. You and I have different definitions.
Yes, I mean mainstream as in "Joe Sixpack could plausibly afford one if he went out and bought a TV right now", not as in "everyone owns one right now". Your definition of "mainstream" is basically "technology from 5 years ago" because that's how long it takes to hit 100% market penetration.

When a 4K TV costs the same as an equivalent 1080p screen - that's "mainstream" to me.

Again comparing prices on Amazon - a 39" 1080p panel is $330 (their #5 top seller), and a Seiki 39" 4K panel is $330. I don't care how much you're getting ripped off at FutureShop. Here we call them Best Buy, and they're a ripoff here too. Last time I was in one, they tried to charge me $25 for an 8" SATA cable. They're slowly going bankrupt, and good riddance. Awful prices and an unpleasant experience.

QuoteQuote:
I go to Futureshop and the cheapest system is $999.99. Futureshop is kind of like a mainstream store. So yes, they are there. But what use are they?

Whoa, where's the connectivity for the 6.1 "latest greatest" DTS Premium Sound system I have in the basement? They're still selling 1080p units that cost 3 times the price of this unit. And I still can't figure out what the heck I'd do with it.
Yeah, content is the Achilles' heel right now. Getting hold of movies in 4K is tough, but 4K movies on triple-layer Blu-ray are coming early next year. And you can stream 4K from Netflix and some other digital delivery services right now.

As for sound - seems like you want a receiver with HDMI passthrough. They're out there on the market, here's a Sony for $300. I don't know what you're complaining about. Worst case, do multichannel analog and route it through your old receiver. Is this just an attempt to rack up the numbers by pretending you need to buy a whole new stereo system too?

QuoteQuote:
I have a 6.1 DTS sound system, and very little it could be used for. It sits in a box in my basement, but I am so ahead of the curve.
I don't know why you haven't plugged in your stereo system, honestly. My guess is that you're kind of a greybeard who's not really into the whole "home cinema" experience, or having the latest camera tech, and so on. Neither am I - I have a 2.1 sound system (2 speakers + a woofer) and an XGA projector, something like 120" diagonal, and I'm in it for about $300, excluding the laptop that drives it, and I enjoy the crap out of it. My 16 MP digital camera does everything I need - 720p video, high ISO - and cost me $400 with kit and normal lenses. Everything is used. Would I accept that quality if I was shelling out top dollar for brand-new tech? Nope.

But you were the one who bought it; if you don't like it, flip it on eBay or something. Don't pretend you're some kind of cautionary tale just because you're too lazy to plug in your stereo. Blu-ray discs usually have DTS streams - you're missing out for no good reason.

QuoteQuote:
Whoa, now this is a kicker....
"How close can I sit before I see pixels" isn't the proper way to evaluate pixel pitch. Again, consider Apple Retina displays - the entire point is that for normal usage there is more resolution than you can actually see, which makes the image very smooth and eases eye strain. It's in effect a form of "super-sampling" for your eyeball.

You can protest tech specs all you want, but people really like high-PPI displays - so much so that Apple is rumored to be phasing out the non-Retina MacBook models sometime this year. The real-world feedback from people who have used them - rather than measurebating specs on the internet - is a resounding YES PLEASE.

I love my high-PPI phone screen myself. I have a Moto G, which would be sold as Retina if it were Apple-branded. Looks great.

Last edited by Paul MaudDib; 10-19-2014 at 02:38 PM.
10-19-2014, 01:59 PM   #63
Veteran Member




Join Date: Jun 2009
Posts: 11,913
QuoteOriginally posted by Paul MaudDib Quote
Yeah, content is the Achilles' heel right now.
It remains so even for HD broadcast content - certainly on free-to-air TV. In Australia, in my region, of the 25 or so free-to-air digital channels, if you watch the signal specs as you switch channels, 90% of the content remains SD. Only two channels are predominantly HD, and even then, not exclusively so.

If you go with premium paid satellite or 'cable' digital TV [Foxtel etc.], there are more HD options, but even then most of the content remains SD. While a lot of the sports footage comes out in HD, hardly any of the news footage, for example, does. No 'news-gatherers' seem to bother with working in HD yet.

If even HD format content is taking so long to dominate the conventional broadcast airwaves, I can't imagine how long it will take before 4K is common.
10-19-2014, 02:11 PM   #64
Senior Member
Paul MaudDib's Avatar

Join Date: Jul 2009
Location: Michigan
Posts: 294
QuoteOriginally posted by rawr Quote
If even HD format content is taking so long to dominate the conventional broadcast airwaves, I can't imagine how long it will take before 4K is common.
Another fun trick they pull is cutting the bandwidth per channel. So it's "HD" (airquotes) but they're cutting the data rate in half to squeeze in more channels. That one's common on satellite TV too.

Honestly, I don't know if it will ever be "common" in the sense that you'll have 500 4K channels. It just takes so much bandwidth to send 4K signals, and there is physically only so much spectrum. You will probably never see "over-the-air" radio-broadcast 4K. Cable delivery, probably - the movie channels and a few major TV networks. But you probably won't have The Home Shopping Channel in 4K until we all have our houses wired up with gigabit fiber optics.

In countries with good internet service, 4K digital delivery will work. The benchmark is about 15 Mbps for 4K content. And for regions with slower internet but high bandwidth caps, you will probably see "near-realtime" systems too - you say you want to watch a movie, your Roku starts buffering it, you come back half an hour later and start watching.
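Here's the napkin math on that, if you want it (Python; the ~15 Mbps figure above is the only assumption, the rest is arithmetic):

# Napkin math on 4K streaming (assumes the ~15 Mbps figure above).
BITRATE_MBPS = 15          # rough 4K stream bitrate
MOVIE_S = 2 * 3600         # a 2-hour movie

size_gb = BITRATE_MBPS * MOVIE_S / 8 / 1000   # megabits -> gigabytes
print(f"2-hour 4K movie: ~{size_gb:.1f} GB")  # ~13.5 GB

# "Near-realtime" viewing: time to pre-fetch the whole film on slower links.
for link_mbps in (5, 10, 25):
    hours = BITRATE_MBPS * MOVIE_S / link_mbps / 3600
    print(f"{link_mbps:>2} Mbps link: ~{hours:.1f} h to buffer it all")

So on a 25 Mbps link you can effectively watch while it buffers; on a 5 Mbps link you queue it up and come back later.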

For Canadians/Australians/etc. with low bandwidth and harsh caps - you'll probably be watching movies on disc. Coming early next year.

But really - it's a home theater technology; movies are the big draw on 4K. It's about being able to watch a blockbuster film in the best possible quality. Is stopping at a Redbox such a burden? Or waiting for Netflix to mail you the disc?


Last edited by Paul MaudDib; 10-19-2014 at 02:29 PM.
10-19-2014, 03:55 PM   #65
Veteran Member




Join Date: Jun 2009
Posts: 11,913
QuoteOriginally posted by Paul MaudDib Quote
movies are the big draw on 4K. It's about being able to watch a blockbuster film in the best possible quality.
Hmmm. That makes it a hard sell to me. A 4K display won't improve the narrative, or the hackneyed Hollywood themes of most blockbusters. Not worth paying extra just to see the same old stuff in higher res.

Gaming, home video/photo viewing/editing make 4K much more interesting, IMHO.
10-19-2014, 04:09 PM   #66
Pentaxian
normhead's Avatar

Join Date: Jun 2007
Location: Near Algonquin Park
Photos: Gallery | Albums
Posts: 40,450
I don't use my 6.1 system because I have no 6.1 content, and the 5.1 system I have is great: a Yamaha amp and three Yamaha NX30 speakers. The 6.1 system came with the new wife... it's now a 5.1. There's more difference between 5.1 and stereo than there was between stereo and mono; 5.1 to 6.1 was a marketing decision, not a quality-of-sound decision. I would never give up my 5.1 sound and music DVDs no matter what new system came out. I know of no plans to do 4K broadcast TV anywhere, just lots of hype. My internet isn't fast enough, and there isn't even 1080p broadcast TV here that my TV would support. I paid $179 for a TV which I can watch happily from 12 feet away, a distance I find comfortable. But honestly I have to ask: if I like sitting 12 feet from my 32-inch TV, why would I want to sit 5 feet from a 65-inch TV? Just to see the extra resolution I'm paying for?

You keep talking about Retina displays, but those are items you typically hold in your hand or in your lap - less than 5 feet away, where it could conceivably make a difference. I'm just playing devil's advocate here, but still, I'm not impressed with the arguments for, and the arguments against are pretty solid.

The current version of the HDMI specification (1.4a) can only output a 4096×2160 resolution at a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter, half that of what we’re used to on TVs and monitors. Connect up a 4K monitor at 30 Hz via HDMI and you’ll see choppier animations and transitions in your OS. You might also encounter some visible motion stuttering during normal use, and you’ll be locked to a maximum of 30 frames per second for your games—it’s playable, but not that smooth.

QuoteQuote:
“…the flicker-induced eyestrain is phenomenal…” – Joel Hruska
“After experimenting with the display in that operating mode [30 Hz] I’m going to flatly state no one will want to use it in anything but an emergency—the flicker-induced eyestrain is phenomenal,” writes Hruska, reviewing the Dell UP2414Q.
“Let me be clear: 30 frames per second sucks. All of my colleagues and I would prefer displays with 60 frames per second or better, all else being equal. But all else is not equal, which means that while 30 frames per second is more bothersome to some than others, we all tolerate it,” writes Brian Hauer, who has been writing about using Seiki’s $500 4K 39-inch HDTV as a computer monitor.

An HDMI upgrade to fix these issues isn’t widespread yet, but it’s on the horizon. HDMI 2.0 supports a full 60 Hz refresh rate at 3840×2160. First-person-shooter fans will finally get to game at a glorious 60 frames per second; desktop users will enjoy smoother screens and movement—no mouse lag either. Unfortunately, HDMI 2.0 is a hardware upgrade. You’ll need HDMI 2.0 on your GPU and monitor, and the only way to get it is to buy brand-new hardware (when it arrives).

DisplayPort has enough bandwidth to deliver 4K at 60 Hz, but most 4K monitors can’t accept it natively. Instead, they employ a clever workaround: they pretend that their giant 4K picture is actually two tiled displays—each 1920×2160, and each running at 60 Hz. DisplayPort transmits both “displays” simultaneously from your computer to your monitor, which the latter seamlessly combines into one giant 3840×2160 picture. Some monitors can pull the same trick with two HDMI cables.

Unfortunately, it’s only clever when it works. Since the monitor is set up to act as if it’s two separate displays, screens that are permanently set to appear on only one display in a multi-display environment (in-game menus, your BIOS screen, your POST screen, etc.) can look pretty horrible.
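If you want to sanity-check those bandwidth figures yourself, the arithmetic is simple (Python sketch; the 4400×2250 total raster is the standard CTA-861 4K timing including blanking, and ~8.16/14.4 Gbps are the usual effective HDMI 1.4/2.0 payload rates after 8b/10b encoding):

# Rough check of why HDMI 1.4 tops out at 4K@30 while HDMI 2.0 handles 4K@60.
TOTAL_W, TOTAL_H, BPP = 4400, 2250, 24   # CTA-861 4K raster incl. blanking

def required_gbps(hz):
    """Raw link rate needed for 3840x2160 at the given refresh rate."""
    return TOTAL_W * TOTAL_H * hz * BPP / 1e9

for hz in (24, 30, 60):
    need = required_gbps(hz)
    print(f"4K @ {hz} Hz: ~{need:.1f} Gbps "
          f"(HDMI 1.4: {'ok' if need <= 8.16 else 'too much'}, "
          f"HDMI 2.0: {'ok' if need <= 14.4 else 'too much'})")

4K@60 needs ~14.3 Gbps, which is why it takes HDMI 2.0 (or the DisplayPort tiling trick) to get there.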

Last edited by normhead; 10-19-2014 at 04:55 PM.
10-19-2014, 06:11 PM   #67
Forum Member




Join Date: Oct 2013
Location: Vancouver, BC
Posts: 84
QuoteOriginally posted by Kerrowdown Quote
What's this "if" word doing in that sentence? I understood it to be a certainty.
Sorry to confuse you, but I used 'if' since I am not certain that Pentax FF is coming.
Not that I don't ever see it coming - I, too, want Pentax FF, and have high hopes for it.
It's just that I don't think two unmarked lenses are enough evidence of FF yet.
I believe it's all speculation and rumour until officially announced... or maybe it's just me, lacking faith until I see the 'sign'.
We will probably see next year what those lenses were meant for, and then we can expect (or not) with certainty...

10-19-2014, 06:46 PM   #68
Veteran Member




Join Date: Dec 2007
Photos: Gallery
Posts: 8,237
"Future Proof", and more data = more play

QuoteOriginally posted by Paul MaudDib Quote
Yeah man, display resolutions have never increased in the past, it's all a gimmick, you tell 'em! Viva la IBM 5153! Long live CGA! NTSC now, NTSC forever!

Frankly it doesn't really matter whether or not you believe it - 4K is a mainstream professional standard nowadays, and has been for years. The Red One came out in 2007, and 4K displays have been around for years. Nowadays the tech has percolated down to consumer price ranges - the Sony A7S shoots it, and there are displays in the $300-500 price range to play it. Starting in 2015 you will even be able to buy movies in 4K on triple-layer Blu-ray. Ostrich all you want about it, it won't change the facts - 4K is here to stay.



Print resolutions are even more demanding than display resolutions. Again, the ability to crop away most of an image and have something left that's worth displaying/printing is pretty useful. Maybe you don't feel like you need it, but it is nice.

I love being able to do that with 6x7 - cropping away 3/4 of a 6x7 negative leaves me with roughly a 35mm negative, which is still a usable amount of resolution. I can make a superb 8x10 print out of that, or probably larger if budget allowed.

Furthermore - 3d stereo lenses cut your resolution in half again. They're definitely cool too - I love me some stereographs. Cross your eyes until the two images cross in the middle, then defocus your eyes onto the virtual image in the center. I'll admit 3d images are gimmicky to most photographers, but I think they're fun. Most TVs will do it nowadays, too.
More MP:

1) will more completely future proof your images (see above)
2) Will allow for deeper "telephoto" cropping while maintaining good IQ
3) Will allow for creative use of different aspect ratios without worry of image degradation at same display sizes (square crop, 5:4 crop, etc)
4) Will allow for... "more" PP. I say "more" because I can't think of a better adjective right now, but it allows you more NR and sharpening at the same time, for example. I found my D800 images more fun to play with than my D700 images. More data = more play.
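To put some numbers on point 2, here's a quick sketch (Python; the D800-ish 7360×4912 pixel dimensions and the common 300 DPI print rule of thumb are my assumptions):

# How much telephoto cropping can a 36 MP file absorb and still print 8x10?
W, H = 7360, 4912            # ~36 MP full-frame file

def prints_8x10(w, h, dpi=300):
    need_w, need_h = 10 * dpi, 8 * dpi   # 3000 x 2400 px, either orientation
    return (w >= need_w and h >= need_h) or (w >= need_h and h >= need_w)

for crop in (1.0, 1.5, 2.0, 3.0):
    w, h = int(W / crop), int(H / crop)
    print(f"{crop:.1f}x crop: {w}x{h} ({w*h/1e6:.1f} MP), "
          f"8x10 @ 300 DPI: {'yes' if prints_8x10(w, h) else 'no'}")

A 2x linear crop throws away 3/4 of the pixels and the remaining ~9 MP still makes a clean 8x10; only past that does it start to fall apart.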

Also, something that you can't determine by reading specs, you have to see it - the same 'amount' of noise in higher MP images looks better, even if there's just as much of it. Finer grained, bringing more detail, more pleasing-looking images.

.
10-19-2014, 07:32 PM   #69
Senior Member
Paul MaudDib's Avatar

Join Date: Jul 2009
Location: Michigan
Posts: 294
QuoteOriginally posted by normhead Quote
You keep talking about Retina displays, but those are items you typically hold in your hand or in your lap - less than 5 feet away, where it could conceivably make a difference. I'm just playing devil's advocate here, but still, I'm not impressed with the arguments for, and the arguments against are pretty solid.
Apple's Retina standard is not really a fixed standard - in devices you hold up to your face, the PPI is higher (e.g. iPhone: 326 PPI). In devices that are farther from your face, the PPI is lower. That is how this is accounted for.

In comparison, a 50" 1080p screen is about 44 PPI. Whether you're sitting far away or not, that's pretty low. A 50" 4K screen doubles that to 88 PPI. Apple's next-gen iMac is moving to a 27" 5K display (218 PPI), which I do think is suggestive of what they'd put into a TV.
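The PPI math, for anyone who wants to check it - it's just Pythagoras over the panel diagonal (Python):

import math

def ppi(w_px, h_px, diagonal_in):
    """Pixels per inch from pixel dimensions and diagonal size."""
    return math.hypot(w_px, h_px) / diagonal_in

print(f'50" 1080p: {ppi(1920, 1080, 50):.0f} PPI')  # ~44
print(f'50" 4K:    {ppi(3840, 2160, 50):.0f} PPI')  # ~88
print(f'27" 5K:    {ppi(5120, 2880, 27):.0f} PPI')  # ~218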

QuoteQuote:
The current version of the HDMI specification (1.4a) can only output a 4096×2160 resolution at a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter, half that of what we’re used to on TVs and monitors. Connect up a 4K monitor at 30 Hz via HDMI and you’ll see choppier animations and transitions in your OS. You might also encounter some visible motion stuttering during normal use, and you’ll be locked to a maximum of 30 frames per second for your games—it’s playable, but not that smooth.

It's correct that a single HDMI 1.4 port can't drive 4K at 60 Hz at present. Manufacturers have a couple of workarounds for this: some use a pair of HDMI ports ganged together, and others run DisplayPort, which supports 60 Hz 4K natively.

So your article is kind of deceptive - yeah, the 30 Hz screens (e.g. Seiki) are only 30 Hz. That's what it says right on the box. But there are 60 Hz screens on the market, from Asus, Samsung and others, at the $500 price point.

Sure, there are probably teething problems with games whose programmers didn't figure people would be ganging up a couple of display channels. It'll be supported going forward now that the issue is known. Some problems with menus are not really that big a deal, and there's always the possibility of some clever workaround in the graphics drivers. If you can get the game to render to one 4K screen and slice it up in the drivers, instead of the game rendering to two half-4K screens, then you're in business.

30 Hz is also much more of an issue for games and desktop usage than for movies. As your article said, everyone would prefer 60 Hz in a vacuum, but if you're not gaming it's not really a big deal. Most video is only 24 or 30 fps anyway.

I also love how we've shifted from "4K video will never be adopted" to "well, just don't buy a 4K panel right now, there are teething issues". Kinda similar to the shift that people on this forum will be making if Pentax ever gets a FF camera on the market.

QuoteOriginally posted by rawr Quote
Gaming, home video/photo viewing/editing make 4K much more interesting, IMHO.
Home video and photo viewing/editing would certainly benefit as well.

Last edited by Paul MaudDib; 10-19-2014 at 08:11 PM.
10-20-2014, 03:13 AM   #70
Loyal Site Supporter
Loyal Site Supporter




Join Date: Mar 2009
Location: Gladys, Virginia
Photos: Gallery
Posts: 27,650
QuoteOriginally posted by jsherman999 Quote
More MP:

1) will more completely future proof your images (see above)
2) Will allow for deeper "telephoto" cropping while maintaining good IQ
3) Will allow for creative use of different aspect ratios without worry of image degradation at same display sizes (square crop, 5:4 crop, etc)
4) Will allow for... "more" PP. I say "more" because I can't think of a better adjective right now, but it allows you more NR and sharpening at the same time, for example. I found my D800 images more fun to play with than my D700 images. More data = more play.

Also, something that you can't determine by reading specs, you have to see it - the same 'amount' of noise in higher MP images looks better, even if there's just as much of it. Finer grained, bringing more detail, more pleasing-looking images.

.
The best way to "future proof" your images is to print them. I hope no one thinks that the majority of images taken today will still be around in ten years. The issue isn't whether or not they are viewable on a 5K screen, it is whether they'll still be in some readable format and retrievable at all.

I think there is certainly a point of diminishing returns. Going from a K20 to a K5 was magical -- not because you were going from 14 to 16 megapixels, but because the pixels were better. Going from 16 megapixels to 24, not so much. On certain photos, you can see a little more detail, but it is a lot tougher to get pixel level sharpness and the files are 50 percent bigger. I think the same will be true going from 36 to 50 megapixel full frame. I hope Pentax doesn't go with larger than 36.
10-20-2014, 07:18 AM   #71
Veteran Member
LensBeginner's Avatar

Join Date: Sep 2014
Photos: Albums
Posts: 4,696
QuoteOriginally posted by Paul MaudDib Quote
Yes, I mean mainstream as in "Joe Sixpack could plausibly afford one if he went out and bought a TV right now", not as in "everyone owns one right now". Your definition of "mainstream" is basically "technology from 5 years ago" because that's how long it takes to hit 100% market penetration.

When a 4K TV costs the same as an equivalent 1080p screen - that's "mainstream" to me.
Oh but 4K TVs are mainstream...
Problem is, in order to drive a 4K monitor you need a heck of a graphics card... you can't just plug it into the HDMI port of a graphics card and expect 4K@60fps...
10-20-2014, 11:08 AM   #72
Veteran Member




Join Date: Dec 2007
Photos: Gallery
Posts: 8,237
QuoteOriginally posted by Rondec Quote
The best way to "future proof" your images is to print them.
In the end those photos could last 100 years if printed with longevity in mind, so it is one of the best ways to future-proof. But if digital recreation is part of your plan.... more data captured is probably going to make you happier in the long run.

QuoteQuote:
I hope no one thinks that the majority of images taken today will still be around in ten years. The issue isn't whether or not they are viewable on a 5K screen, it is whether they'll still be in some readable format and retrievable at all.
Do you mean format? I will absolutely guarantee that any JPEG or TIFF will be viewable 20, 30 years from now or more. Maintaining software to view JPEG or TIFF is trivial. The problem is making sure they're saved in the first place on something that doesn't die on you or go away when someone declares bankruptcy.

I'm even convinced raw formats like .DNG, .NEF, etc. will be viewable/convertible, mainly because the methods used to convert them don't just disappear; they're well known and easy to carry forward. Even if the major software vendors discontinue ongoing support for a format, there will always be a conversion method available.
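To illustrate how trivial "maintaining a viewer" really is - opening and converting a TIFF is a couple of lines with any maintained imaging library. A sketch with Pillow (the file paths are just placeholders):

from PIL import Image

# Open a decades-old TIFF and re-save it as JPEG; both formats are so well
# documented that libraries like this will carry them forward indefinitely.
with Image.open("archive/scan_001.tif") as im:
    im.convert("RGB").save("archive/scan_001.jpg", quality=95)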
10-20-2014, 12:23 PM   #73
Loyal Site Supporter
Loyal Site Supporter




Join Date: Mar 2009
Location: Gladys, Virginia
Photos: Gallery
Posts: 27,650
QuoteOriginally posted by jsherman999 Quote
In the end those photos could last 100 years if printed with longevity in mind, so it is one of the best ways to future-proof. But if digital recreation is part of your plan.... more data captured is probably going to make you happier in the long run.



Do you mean format? I will absolutely guarantee that any JPEG or TIFF will be viewable 20, 30 years from now or more. Maintaining software to view JPEG or TIFF is trivial. The problem is making sure they're saved in the first place on something that doesn't die on you or go away when someone declares bankruptcy.

I'm even convinced raw formats like .DNG, .NEF, etc. will be viewable/convertible, mainly because the methods used to convert them don't just disappear; they're well known and easy to carry forward. Even if the major software vendors discontinue ongoing support for a format, there will always be a conversion method available.
I guess between hard drive crashes, DVD failures, etc., I am not convinced that people will be able to access their photos in the long run. I know people who had a whole year's worth of snaps on a couple of memory cards, only to have one of them go bad, losing a large percentage of their photos. Folks on the forum are probably a lot more compulsive about backing photos up in multiple ways (hard drives, cloud, Blu-ray discs), but in general, I think technology has tended to make people's photos more tenuous.

At least in the "old days," people had negatives (which last a long time) and some prints, that maybe their kids would happen on in a shoe box when they were cleaning out a closet, but now there won't be anything but a CD or a DVD that will likely just get pitched in the garbage.
10-20-2014, 12:39 PM   #74
Veteran Member
LensBeginner's Avatar

Join Date: Sep 2014
Photos: Albums
Posts: 4,696
QuoteOriginally posted by Rondec Quote
I guess between hard drive crashes, DVD failures, etc., I am not convinced that people will be able to access their photos in the long run. I know people who had a whole year's worth of snaps on a couple of memory cards, only to have one of them go bad, losing a large percentage of their photos. Folks on the forum are probably a lot more compulsive about backing photos up in multiple ways (hard drives, cloud, Blu-ray discs), but in general, I think technology has tended to make people's photos more tenuous.
*snip*
Yeah, I guess some of the less tech-savvy individuals around can get a false sense of security from nice, polished interfaces and from these "techy" buzzwords everyone is lobbing around... I'm talking about the non-digital-natives.
I know a couple of them too who still don't have a serious backup plan in practice... as for the others I know... well, experience is the best teacher, although quite a harsh one... but I bet they won't get burned a second time...
10-20-2014, 12:40 PM   #75
Veteran Member




Join Date: Dec 2007
Photos: Gallery
Posts: 8,237
QuoteOriginally posted by Rondec Quote
I guess between hard drive crashes, DVD failures, etc, I am not convinced that people will be able to access their photos in the long run. I know people who had all a years worth of snaps on a couple of memory cards only to have one of them go bad, thereby losing a large percentage of their photos. Probably folks on the forum are a lot more compulsive about backing photos up in multiple ways (hard drives, cloud, blue ray disks), but in general, I think technology has tended to make people's photos more tenuous.

At least in the "old days," people had negatives (which last a long time) and some prints, that maybe their kids would happen on in a shoe box when they were cleaning out a closet, but now there won't be anything but a CD or a DVD that will likely just get pitched in the garbage.
I think you're right. It's kind of like saving money - a lot of people don't see the need until it's too late.

For my part I don't have all the answers either. I'm wondering which external media and cloud vendors to 'bet on', and I'm left with a nagging suspicion that I may be eternally grateful to my wife's parents, who have bothered to print thousands of 4x6's and 5x7's of my family. I obviously don't plan on relying on those prints alone, but they're a good backup plan in case all my high-minded digital archive efforts fail.
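One thing that does help on the digital side is checksumming the archive, so silent corruption shows up before the last good copy is gone. A minimal sketch (Python; the folder name is just an example):

import hashlib, pathlib

def sha256_of(path, chunk=1 << 20):
    """Hash a file in chunks so large raw files don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Print a manifest; diff two runs (or two copies) to spot bit rot.
for p in sorted(pathlib.Path("photos/2014").rglob("*")):
    if p.is_file():
        print(sha256_of(p), p)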