Samsung Galaxy Book discussion thread

Discussion in 'Samsung' started by JoeS, Feb 26, 2017.

  1. ammarr

    ammarr Pen Pal - Newbie

    Messages:
    31
    Likes Received:
    10
    Trophy Points:
    16
    I've had the GB10 for a few days now, and on 3 of those days I've gotten headaches and extreme eye strain after using it for ~2 hrs. Could just be the flu that's going around, but that wouldn't explain the eye strain.

    I hadn't made the connection until I saw your post because I've never had issues with eye strain after using any device. Would be pretty annoying if this became a consistent problem, because the tablet seems very solid otherwise.

    Edit: that seems to be from the 12" review, and mine is the 10", so it wouldn't have the same display.
     
    DRTigerlilly and vagabond95 like this.
  2. Bronsky

    Bronsky Wait and Hope. Senior Member

    Messages:
    7,690
    Likes Received:
    4,147
    Trophy Points:
    331
    They do not. The 12" has the OLED display while the 10.6" has a TFT/IPS display.
     
  3. Shizaru

    Shizaru Scribbler - Standard Member

    Messages:
    556
    Likes Received:
    392
    Trophy Points:
    76
    The information was generic and no distinction was made between models. AFAIK the socket architecture is the same, so upgrading the CPU won't require any alterations to the existing board. I'm not sure I understand why you think upgrading the GPU is a necessity if the CPU is upgraded?

    I agree with respect to the margins. If these companies want to hit key price points at the expense of providing key features, there is a risk that they will lose sales. Someone at Samsung obviously draws a line at a point where they assume they can hit a target that returns a high enough sales volume to make a production run profitable. People are generally willing to pay more for devices that deliver a degree of future proofing, which generally means hardware that at least meets the requirements of currently available technology. In this case Samsung should provide a full-spec USB-C port. The fact that the GPU isn't the highest available may be less critical, given that there is a dGPU on the device at all, which may meet the needs of the average buyer. This is a productivity device, not a gaming machine, so graphics demands are somewhat more modest.

    From a personal perspective, I would rather pay the price needed to purchase a machine that is as future proof as possible, which in this case would include the better GPU. I can't really complain about the performance of the Gen7 N9P I have, or the compromises I accepted came with it. The short period of time between the release of the Gen7 and the announcement of the Gen8 is obviously a factor early adopters couldn't have foreseen. I don't think it's unreasonable for someone buying a model with a Gen8 to expect the specifications of the supporting hardware to be improved accordingly. I was seriously considering buying the Gen8 machine when it becomes available, but I'm not so sure about that now if the USB-C and GPU remain the same. The only factor swaying me is that I could probably recover virtually what I paid for my current device, given that the N9P isn't available in the UK. I could make use of the additional cores of a Gen8 if I had them, but I can get stuff done without them.
     
  4. dstrauss

    dstrauss Comic Relief Senior Member

    Messages:
    9,755
    Likes Received:
    7,724
    Trophy Points:
    331
    The CPU and dGPU are usually tuned for one another, and in this instance the existing dGPU of the 7th-gen model of the 15" Notebook 9 Pro is an AMD Radeon 540; it wouldn't make sense to couple that 8th-gen quad-core processor to a low-end AMD part when everyone else (HP, Acer, etc.) is using the MX150 with it.

    Despite my love for all things WEMR, I snagged an HP Spectre x360 15 with the 8th Gen Kaby Lake and MX150 - may be here by the end of the month (in time to see the Microsoft WOA iPad Pro note-taking dream machine we have dreamed of - hint hint Panos - get with it).
     
    Last edited: Oct 5, 2017
  5. Bronsky

    Bronsky Wait and Hope. Senior Member

    Messages:
    7,690
    Likes Received:
    4,147
    Trophy Points:
    331
    If they can use the same motherboard, keeping the same dGPU would be a really cheap way to upgrade. You would think, however, that with a device such as this, the MX150 would be the choice.
     
  6. dstrauss

    dstrauss Comic Relief Senior Member

    Messages:
    9,755
    Likes Received:
    7,724
    Trophy Points:
    331
    I don't know about this U-series processor, but PCWorld ran an article saying the desktop 6-core unit can't use current motherboards despite using the exact same socket...
     
    Bronsky likes this.
  7. Mytabletacct

    Mytabletacct Pen Pal - Newbie

    Messages:
    27
    Likes Received:
    16
    Trophy Points:
    6
    I was aware of this possibility. I have two tiny holes where I stuck the probes, but it's still possible I didn't dig deep enough.
     
    Bronsky likes this.
  8. Shizaru

    Shizaru Scribbler - Standard Member

    Messages:
    556
    Likes Received:
    392
    Trophy Points:
    76
    Unfortunately, judging from what little information I could find on the upgrade, it doesn't appear as though sense is a significant factor in the decision to push out the Gen8 on the N9P line. I will be more than happy for them to prove me wrong!

    I hope you enjoy your X360 and all the goodness it brings with it. :)

    I have a feeling your initial point is going to be the case, despite the fact that bumping up the graphics to an MX150 would be the obvious move to make.
     
    dstrauss and Bronsky like this.
  9. thatcomicsguy

    thatcomicsguy Pen Pro - Senior Member Senior Member

    Messages:
    3,230
    Likes Received:
    2,280
    Trophy Points:
    231
    One way to test for PWM on a device is to shoot a video of the screen. If the recording flickers, then the display uses PWM.

    Here's an example of the Galaxy Book 12" where this is apparent:

    [embedded video]
    In other videos of the same device, there is no flicker apparent. That probably means the user had the brightness turned up to 100%, where the LED backlight receives uninterrupted power and thus doesn't flicker.
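
    If you'd rather not judge flicker purely by eye, you can also measure it from the clip. This is just a rough sketch, assuming OpenCV and NumPy are installed; "screen_clip.mp4" is a placeholder name for your own recording. A strong peak in the spectrum hints at PWM beating against the camera, while a flat signal suggests DC dimming (or PWM too fast for the camera to resolve):

        import cv2
        import numpy as np

        # Placeholder file name: a short clip of the screen, ideally shot
        # at a low brightness setting where PWM dimming would be active.
        cap = cv2.VideoCapture("screen_clip.mp4")
        if not cap.isOpened():
            raise SystemExit("could not open clip")
        fps = cap.get(cv2.CAP_PROP_FPS)

        means = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            means.append(gray.mean())  # average brightness of this frame
        cap.release()

        # Look for a periodic component in the brightness signal.
        signal = np.array(means) - np.mean(means)
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
        peak = freqs[spectrum[1:].argmax() + 1]  # skip the DC bin
        print(f"Strongest flicker component: {peak:.1f} Hz")

    Bear in mind that the camera's own frame rate, exposure and any re-encoding can mask or mimic flicker, so treat the result as a hint rather than proof.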

    PWM dimming isn't ideal. My Samsung Notebook 9 Pro also uses PWM for brightness control, and it also causes eye strain. My workaround is to keep the screen at full brightness, but change the screen colour and gamma settings. (Essentially, make the white pixels grey, using the LCD itself as a set of sunglasses.)

    This can be achieved through a variety of freeware programs specifically written to help with this problem. I don't bother with them, preferring to just use the built-in graphics card settings. On my computer, it's easy to set up different colour profiles and switch between them. (I also like to knock down the blue and turn up the red a bit, which is supposed to be easier on the eyes.)
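
    If you'd rather script this than click through driver panels, Windows exposes a gamma-ramp API in GDI that does the same greying-out trick. A minimal Windows-only sketch in Python via ctypes; the 0.6 scale factor is an arbitrary example, and some drivers refuse ramps they consider out of range:

        import ctypes

        # Build a 3 x 256 table of 16-bit gamma values (R, G, B channels).
        ramp = ((ctypes.c_ushort * 256) * 3)()
        scale = 0.6  # arbitrary example: cap whites at ~60% output
        for ch in range(3):
            for i in range(256):
                ramp[ch][i] = int(min(65535, i * 257 * scale))

        user32 = ctypes.windll.user32
        gdi32 = ctypes.windll.gdi32
        user32.GetDC.restype = ctypes.c_void_p
        user32.ReleaseDC.argtypes = [ctypes.c_void_p, ctypes.c_void_p]
        gdi32.SetDeviceGammaRamp.argtypes = [ctypes.c_void_p, ctypes.c_void_p]

        hdc = user32.GetDC(None)  # device context for the entire screen
        if not gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp)):
            print("Driver rejected the gamma ramp")
        user32.ReleaseDC(None, hdc)

    Setting scale back to 1.0 and re-running restores normal output; the freeware tools mentioned above are essentially friendlier wrappers around this kind of call.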

    Windows 10 also has a "Night light" mode in the settings panel, where it will change the red/blue levels with a slider. It doesn't change brightness, but it's a neat little tool nonetheless.

    In any case, this approach is not an ideal solution; you don't get the power savings you do from real dimming (either PWM or linear current control), and the dimmed colour profile, even if it's not bad, is not going to be good enough for photographers and other people for whom colour accuracy is very important.

    But it does mean you can use your device in low-light settings without burning out your eye sockets with high brightness, and without giving yourself a headache with PWM. (There are other health risks involved in staring at a sub-perceptible flickering light beyond just headaches.)
     
    Last edited: Oct 5, 2017
    Nest likes this.
  10. Azzart

    Azzart Late night illustrator Senior Member

    Messages:
    2,261
    Likes Received:
    1,571
    Trophy Points:
    181

    I don't think it's that easy.
    You won't see flicker in my Galaxy Book 12 review; at worst you'll see the phone camera auto-adjusting when the light from the screen changes intensity, and I can assure you I never had the display brightness at 100%. It was probably at 50% or below.

    Too much depends on the camera recording the video and the ambient light. There is also the re-encoding of the video if you edit it in video editing software, and on top of that YouTube adds its own compression, reducing the quality further.
    What's more, the same camera can give you very different results: I have three videos online, all three taken with the iPhone 6+, and all three show the display differently. ;)
     
    Nest, WillAdams and thatcomicsguy like this.