When Samsung announced the Galaxy Note 2, with its 5.5" 1280x720 HD Super AMOLED display, I assumed it was a Pentile display. But it seems that the Note 2 actually uses an RGB matrix in a unique arrangement (see the photo below). Samsung calls this new matrix S-Stripe. This is rather confusing on several counts - mostly because until now Samsung used the Super AMOLED Plus brand for non-Pentile OLEDs.
Just a few weeks ago we explained that Pentile OLED displays enable a longer lifetime, and we were told that for an RGB OLED above 230 PPI the lifetime becomes too low for Samsung, so it chooses Pentile for those displays. But the Note 2 has a PPI of 267 - the highest-PPI non-Pentile OLED to date. This means it has a shorter lifetime than an equivalent Pentile display (but the advantage, of course, is that there's no visible Pentile pattern).
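As a quick sanity check of that figure, here's the standard PPI calculation as a small Python sketch (assuming the announced 1280x720 resolution and 5.5" diagonal):

```python
import math

# Rough PPI check for the Note 2 panel (assumed 1280x720 at a 5.5" diagonal)
width_px, height_px = 1280, 720
diagonal_inches = 5.5

diagonal_px = math.hypot(width_px, height_px)  # pixel count along the diagonal
ppi = diagonal_px / diagonal_inches
print(round(ppi))  # ~267
```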
It seems likely that Samsung managed to increase the lifetime of its OLED displays, and so was able to ditch Pentile here. It's possible that Samsung has finally started to use a green PHOLED emitter (which improves the lifetime over the green fluorescent emitter used in current displays), and perhaps it has also moved to a better fluorescent blue emitter.
Comments
I was wondering the same thing, regarding Ignis and their alternative sub-pixel layout. Looking forward to seeing the display in person. Having said that, I must add that I have no complaints about the display in my Galaxy S3. It is a beautiful display. When I first heard about Pentile it concerned me, but the displays I have seen have all been impressive. I give Samsung credit for their continued investment and R&D in OLEDs.
Could be, I guess - the two certainly look similar. Then again, in the pictures of the Samsung display it looks as if the red and the green sub-pixels are the same size, while in the Ignis layout the red is quite a bit larger than the green. So who knows...
As for the whole Pentile discussion... personally I think it is overblown. My personal opinion is that 99% of all potential customers would probably never notice any difference between a Pentile and a non-Pentile display in everyday applications unless you explicitly told them to look for one. The only reason this is discussed so much is probably because the remaining 1% are tech geeks like us who usually write the earliest reviews of these devices ;)
Thanks for the comments, guys. This is indeed interesting, and using this scheme may help explain how they managed to achieve a good lifetime at 267 PPI. I posted about it here.
In the PenTile RGBG matrix, is the green sub-pixel shared between two pixels, or does each pixel have its own RGBG pattern? And when PPI is quoted, are the sub-pixels taken into account?
In Pentile the green is "shared", yes - you can see that in the image at the top of the post. When people say PPI they mean pixels per inch, not sub-pixels per inch.
Ok, so all of this talk in this and other articles got me thinking. Before the Pentile discussion began to get heated all over the web, everyone was concerned about their phones' PPI. But once the Pentile displays came out and veered away from the typical RGB stripe (maybe there were other display types before Pentile, but that's kind of irrelevant to the point I'm going to make), there was a whole other thing to consider when it came to display clarity. IMHO this clarity comes down to the smallest element that makes up the picture, and that, for all intents and purposes, is the sub-pixel, not the pixel itself.
Now, personally, when the Note 2 was first announced I was pissed that Samsung decided to make the screen bigger and lower the pixel count, because I currently have the S3 with Sprint and for me the bigger the screen the better - I use my phone for TV and movies all the time. I understood that they wanted a 16:9 aspect ratio, but then why not just increase the horizontal resolution (or both), rather than decrease the vertical? Especially since it was not a SAMOLED+, but just a regular SAMOLED, which obviously meant another Pentile, right? But then I started reading up on the screen and found out that even though it does not follow the typical RGB stripe, it does in fact have all three red, green and blue sub-pixels in each pixel, instead of sharing sub-pixels like Pentile does, which is shown very clearly in this picture (removed).
This got me curious because, as I stated before, at least to my eyes the sub-pixel count seems to have a greater effect on image quality than the pixel count itself. To me this was proven when I got my Sprint Galaxy S2: even though it had the same number of pixels on a 0.52" larger screen than my original Galaxy S, the screen clarity on my S2 was noticeably better, which, if you don't consider the sub-pixel difference, should not be true. So I decided to compare what I am going to call the "SPPI" (sub-pixels per inch) of the original Note, the S3 and the Note 2. I own an S3, and as much as I would love a larger screen, I really would not be happy with the huge loss of quality that would normally come from stretching the same number of pixels over an extra 0.7" (the S3 has a 4.8" screen and the Note 2's display is 5.5"). What I found out really surprised me.
So here's the math behind my conclusions. First of all, according to the information from Samsung that can be found here: "http://www.oled-info.com/files/images/amoled-vs-super-amoled-plus.jpg", the sub-pixel count on the SAMOLED WVGA screen is 768,000 and for the SAMOLED+ WVGA screen it is 1,152,000. Now, a WVGA screen is 800x480, which comes to 384,000 total pixels; the Note 1 is 1280x800, which equals 1,024,000, and the S3 and Note 2 are both 1280x720, equaling 921,600. At this point I needed to figure out the number of sub-pixels on each phone by doing a simple algebraic comparison of the ratio of pixels to sub-pixels between the WVGA screens and each of the other phones, which is done here (a short script reproducing this follows the list):
Note 1: 384,000/768,000=1,024,000/x and x = 2,048,000 sub-pixels
Galaxy S3: 384,000/768,000=921,600/x and x = 1,843,200 sub-pixels
Note 2: 384,000/1,152,000=921,600/x and x= 2,764,800 sub-pixels
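If it helps, here is the same cross-multiplication as a small Python sketch (the 384,000 / 768,000 / 1,152,000 WVGA figures are the ones quoted above; the helper function is just my own naming):

```python
# Sub-pixels per pixel implied by Samsung's WVGA figures:
# SAMOLED (PenTile): 768,000 / 384,000 = 2 per pixel
# SAMOLED+ (RGB):    1,152,000 / 384,000 = 3 per pixel
def subpixels(total_pixels, wvga_subpixels, wvga_pixels=384_000):
    return total_pixels * wvga_subpixels / wvga_pixels

print(subpixels(1280 * 800, 768_000))    # Note 1 (PenTile)    -> 2,048,000
print(subpixels(1280 * 720, 768_000))    # Galaxy S3 (PenTile) -> 1,843,200
print(subpixels(1280 * 720, 1_152_000))  # Note 2 (RGB)        -> 2,764,800
```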
Now at this point I needed to take the aspect ratio of each phone, which is 16:10 for the first Note and 16:9 for the other two phones, and use the formula on this website to get the horizontal and vertical "sub-pixel resolution": "http://www.wikihow.com/Calculate-a-Digital-Camera's-Resolution-from-its…". In doing this I came up with the following resolutions (there is also a small sketch of the calculation after the list):
Note 1: 1,810 x 1,131
Galaxy S3: 1,810 x 1,018
Note 2: 2,217 x 1,247
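For anyone who wants to reproduce this without the website, here's a small sketch of the same aspect-ratio split (the formula is the standard horizontal = sqrt(total × aspect) split; the function name is mine):

```python
import math

# Split a total sub-pixel count into a horizontal x vertical "resolution"
# for a given aspect ratio.
def subpixel_resolution(total_subpixels, aspect_w, aspect_h):
    horizontal = math.sqrt(total_subpixels * aspect_w / aspect_h)
    vertical = total_subpixels / horizontal
    return round(horizontal), round(vertical)

print(subpixel_resolution(2_048_000, 16, 10))  # Note 1    -> (1810, 1131)
print(subpixel_resolution(1_843_200, 16, 9))   # Galaxy S3 -> (1810, 1018)
print(subpixel_resolution(2_764_800, 16, 9))   # Note 2    -> (2217, 1247)
```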
I then put that information, along with the screen sizes, into this website: "..." and came up with the following "SPPI" results (the same calculation is sketched in code after the list):
Note 1: 402.7 “SPPI”
Galaxy S3: 432.6 “SPPI”
Note 2: 462.5 “SPPI”
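The "SPPI" figures are just the usual diagonal-per-inch formula applied to those sub-pixel resolutions. A quick sketch, assuming a 5.3" diagonal for the original Note and the 4.8"/5.5" diagonals mentioned above:

```python
import math

# "SPPI": diagonal of the sub-pixel resolution divided by the screen diagonal in inches
def sppi(h_subpixels, v_subpixels, diagonal_inches):
    return math.hypot(h_subpixels, v_subpixels) / diagonal_inches

print(round(sppi(1810, 1131, 5.3), 1))  # Note 1    -> ~402.7
print(round(sppi(1810, 1018, 4.8), 1))  # Galaxy S3 -> ~432.6
print(round(sppi(2217, 1247, 5.5), 1))  # Note 2    -> ~462.5
```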
So, in a scenario similar to the Galaxy S vs. S2 clarity comparison, as long as my calculations are correct, theoretically not only should the Note 2 be clearer than the original Note despite having a larger screen and a lower resolution, it should also be clearer than the Galaxy S3! I have to say I was shocked at that and very much look forward to actually holding one in my hand to compare against my S3. I'd like to hear some opinions on whether people generally agree or disagree with what I came up with here, and from anyone who has held the S3 and Note 2 side by side, whether these calculations hold up in the real world.
At first I thought the Note 2 screen was much better than the Note 1's, but then I saw this video and I don't understand why the Note 2 looks blurry. http://www.youtube.com/watch?v=ealqvz196M0 (3:40)
Yea, good job.
Just a small correction - the Note 2 screen is 5.55 inches diagonal (not the 5.5 you used in the calculation).
To add, you can get the number of sub-pixels more easily than the way you did it (a combined sketch follows the list below):
Note1: 1280*800*2=2,048,000 sub-pixels
Note2: 1280*720*3=2,764,800 sub-pixels
GS3: 1280*720*2=1,843,200 sub-pixels
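Putting that correction and the direct sub-pixel counts together, the whole estimate can be redone in a few lines (my own sketch; the 5.55" Note 2 diagonal is the corrected figure above, and the 5.3"/4.8" diagonals are the ones assumed earlier):

```python
import math

# Direct "SPPI" estimate: pixels x sub-pixels-per-pixel, split by the aspect ratio,
# then divided by the screen diagonal in inches.
def sppi(width_px, height_px, subpixels_per_pixel, diagonal_inches):
    total = width_px * height_px * subpixels_per_pixel
    aspect = width_px / height_px
    horizontal = math.sqrt(total * aspect)
    vertical = total / horizontal
    return math.hypot(horizontal, vertical) / diagonal_inches

print(round(sppi(1280, 800, 2, 5.3), 1))   # Note 1 (PenTile)
print(round(sppi(1280, 720, 2, 4.8), 1))   # Galaxy S3 (PenTile)
print(round(sppi(1280, 720, 3, 5.55), 1))  # Note 2 (RGB) -> ~458 with the corrected diagonal
```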
No, they base PPI on perceived pixels. So RG is one pixel and BG is another. But the Note 2 has 3 sub-pixels per pixel, which is awesome. I hope it comes to Verizon and I hope the GPU beats the Droid Razr Maxx HD's. If it does, or even matches it in graphical performance, I'm on it.
Were you serious here? The Note 2 uses an overclocked Mali, while the Razr Maxx HD uses an Adreno 225. The GSIII (International) crushes any dual-core S4. The Adreno 320 will probably be faster than the Note 2's Mali.
It depends on how the graphics unit controls the image, though... I would agree 100% if sub-pixel control were used (like in Sharp Aquos TVs), but if the functional unit in the mind of the GPU is the whole pixel, then the issue is more complicated... Pentile could have small pixels (only two sub-pixels each), but colour representation at the pixel level would be limited due to the missing sub-pixel colour range... this is compensated for by adjacent pixels...
It also depends on how much blank space there is between pixels (some screens give a screen-door effect, where you can see a black grid outlining the pixels)... but at the end of the day the brain does a lot of averaging anyway, so the numerical results won't always translate into apparent performance.
All that said, I appreciate what you mean by measuring the sub-pixels, and I agree that there needs to be a more standardized way to convey this spec if it is something that will be marketed... Just like it would be nice for a standard set of benchmarks to be included (the problem is, you'd get companies designing for benchmarks rather than performance... and everyone loses)...
Does that mean Samsung is using IGNIS technology?