Surface Neo Pre-release Discussion Thread

Discussion in 'Microsoft' started by JoeS, Oct 2, 2019.

  1. Bishop

    Bishop Keeper of Odd Knowledge Senior Member

    Messages:
    778
    Likes Received:
    1,098
    Trophy Points:
    156
    Insularity may become their greatest weakness.
     
    nnthemperor likes this.
  2. Bishop

    Bishop Keeper of Odd Knowledge Senior Member

    Messages:
    778
    Likes Received:
    1,098
    Trophy Points:
    156
    Given the current fascination with "fail fast", AGILE methodology and SCRUM meetings, there are so many teams running around like ferrets on meth it'll be a miracle if the right learnings coalesce with the right leaders in time to make the right choices.
     
    nnthemperor likes this.
  3. rabilancia

    rabilancia Pen Pal - Newbie

    Messages:
    46
    Likes Received:
    61
    Trophy Points:
    26
    No argument at all! I seem to recall that a guy named Darwin once said something about survival. :cool:
     
    Bishop likes this.
  4. dstrauss

    dstrauss Comic Relief Senior Member

    Messages:
    10,895
    Likes Received:
    9,446
    Trophy Points:
    331
    More likely the walled garden will become even more attractive since users can play with their favorite iOS apps on the big(ger) screen...sometimes familiarity breeds contempt, but it can also be more tranquil and less disruptive...
     
  5. Bishop

    Bishop Keeper of Odd Knowledge Senior Member

    Messages:
    778
    Likes Received:
    1,098
    Trophy Points:
    156
    What part of technology in the last 40 years hasn't been premised on disruption?
     
  6. dstrauss

    dstrauss Comic Relief Senior Member

    Messages:
    10,895
    Likes Received:
    9,446
    Trophy Points:
    331
    Point taken - but with Intel handing out duds left and right, can you blame a guy for a little stability?
     
    nnthemperor likes this.
  7. Bishop

    Bishop Keeper of Odd Knowledge Senior Member

    Messages:
    778
    Likes Received:
    1,098
    Trophy Points:
    156
    I don't blame you for wishing to have it. I've reconciled myself to the conclusion that stability, like control, is at best temporary and often illusory in the final analysis.
     
    Last edited: Sep 22, 2020
    nnthemperor and dstrauss like this.
  8. sonichedgehog360

    sonichedgehog360 AKA Hifihedgehog Senior Member

    Messages:
    2,189
    Likes Received:
    1,762
    Trophy Points:
    181
    I will strongly contend this point. An instruction set alone does not a powerful processor make. Qualcomm has yet to release a processor that decidedly overtakes the Y-class Core processors in performance. If you were talking about an Apple A series, I would totally agree with you, but ARM is no more a class-leading architecture than x86 is. Yes, once upon a time there was a lot of legacy instruction set bloat that encumbered x86, but that has since been dealt with via micro-ops. That is why every modern CISC processor is actually a RISC internally. And to further complicate things, many modern RISCs, including ARM, have added CISC-like instructions and use micro-ops. So the "ARM is inherently better than x86" argument actually does not hold water.
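    As a toy illustration of the micro-op point above (a hypothetical sketch; real decoders are vastly more complex, and the instruction encoding here is invented for clarity): a CISC-style instruction with a memory operand can be split into simple RISC-like load/compute/store micro-ops, which is roughly what modern x86 front ends do internally.

    ```python
    # Toy sketch of CISC-to-micro-op decoding (hypothetical, simplified).
    # A memory-destination instruction like "add [100], r1" is split into
    # three RISC-like micro-ops: load, register add, store.

    def decode(instr):
        """Split one CISC-style instruction tuple into simple micro-ops."""
        op, dst, src = instr              # e.g. ("add", "[100]", "r1")
        if dst.startswith("["):           # memory destination -> 3 micro-ops
            addr = dst.strip("[]")
            return [
                ("load",  "tmp", addr),   # tmp <- mem[addr]
                (op,      "tmp", src),    # tmp <- tmp OP src
                ("store", addr,  "tmp"),  # mem[addr] <- tmp
            ]
        return [instr]                    # register-only ops pass through

    uops = decode(("add", "[100]", "r1"))
    # three micro-ops out of one "CISC" instruction
    ```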

    What truly matters is the implementation of the instruction set, the internal microarchitecture and the manufacturing process it is produced on. Apples and oranges. Or Apples and Qualcomms. Or Intels and AMDs. AMD and Apple are the current leaders in their respective areas because they found the most efficient way to physically and logically implement their target instruction sets. AMD proves, in fact, that x86 still has a lot of traction and flexibility ahead. And before we cheerlead too hard for ARM, let's not forget this talking point: with NVIDIA now buying up ARM, ARM license holders do not want to give away their trade secrets to a chief competitor who holds the cards. I would not put all my apples or eggs in the ARM basket, especially with RISC-V suddenly gaining huge interest from ARM license holders.

    Given the above, it has nothing to do with ARM being better than x86. It is because Intel, the principal maker of laptop processors, continues to slide and surrender ground and just make outrageous blunders. So Microsoft is diversifying in the expectation that Intel will become number two to ARM and AMD. Intel's 7nm is now years behind and they are scrambling to make 10nm work in their poorly managed company. Heck, Intel's graphics driver "eye candy edition" control panel, Graphics Command Center, contrary to what their website claims, is just a repainted version of Graphics Control Panel, only with bugs tacked on. It is one of the "brilliant" (I say this heavy on the sarcasm) projects that Raja Koduri cooked up under his watch.

    Command Center still has a major bug, over a year later, where custom resolutions do not work on secondary displays. Try it. It will generate an error if you try adding a custom resolution to a secondary display. And they have yet to fix it in Command Center. But with it in its broken state, Intel is supposedly going to eventually have Command Center supersede Control Panel and retire Control Panel completely. As someone who actually relies on custom resolutions (for marquees, for example, a very commonplace tech item in stores, arcades, airports, and restaurants), that gives me one more reason to ditch Intel CPUs and to point colleagues to AMD APUs when Intel cannot get their graphics drivers together.
     
    Last edited: Sep 22, 2020
  9. desertlap

    desertlap Scribbler - Standard Member Senior Member

    Messages:
    2,637
    Likes Received:
    3,310
    Trophy Points:
    181
    I will agree with you on two points only. First, Nvidia's purchase of ARM does put up a major red flag at the moment for ARM generally, though Apple with their A series is likely immune, as is Qualcomm generally speaking (and the upcoming 1000 line has a major boost in performance).

    Second, you are correct that both architectures have borrowed from each other, in that ARM has gained some pseudo-CISC-like instructions and x86 has borrowed things like asymmetric cores, most recently in Lakefield (albeit poorly).

    However, I stand by my contention that RISC-based architectures have much greater future potential in everything from raw clock speed to power efficiency. And when you remove the incredibly bloated Windows OS from the picture and run an optimized OS such as Linux, it's no contest. But you go ahead and keep cheerleading AMD. They make some solid chips and are winning the overall x86 performance wars for the moment, though not by all that much, except maybe with Threadripper, which is not a consumer part.

    OTOH, AMD's graphics chipsets are really impressive and, relevant to this argument, are ARM/RISC based.
     
    Last edited: Sep 23, 2020
    dstrauss likes this.
  10. desertlap

    desertlap Scribbler - Standard Member Senior Member

    Messages:
    2,637
    Likes Received:
    3,310
    Trophy Points:
    181
    So, one somewhat sketchy rumor out of Asia this morning. It has nothing to do with chip or OS choice, but allegedly the Neo is still being tweaked, including the hardware. The rumor is that the Neo will be one of the first consumer mini LED devices.

    One benefit of that, at least in the engineering samples we have looked at, is that in comparably sized displays, mini LED consumes about 30% less power.

    OTOH, one thing that makes me skeptical is that, again at least in the engineering samples we've seen, mini LED is a bit thicker than the thinnest LCDs we've seen, due to the control electronics.
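    For a sense of what a ~30% display power saving could mean at the system level, here is a back-of-the-envelope calculation. All of the wattage and battery figures below are assumed placeholder numbers for illustration, not Neo specs.

    ```python
    # Back-of-the-envelope estimate of runtime gain from a 30% lower
    # display power draw. All inputs are assumptions, not measured specs.

    lcd_display_w  = 2.0    # assumed LCD display draw, watts
    other_system_w = 3.0    # assumed rest-of-system draw, watts
    battery_wh     = 30.0   # assumed battery capacity, watt-hours

    mini_led_display_w = lcd_display_w * (1 - 0.30)   # 30% less -> 1.4 W

    hours_lcd  = battery_wh / (lcd_display_w + other_system_w)
    hours_mled = battery_wh / (mini_led_display_w + other_system_w)
    # hours_lcd = 6.0 h; hours_mled ~ 6.8 h
    ```

    With these assumed numbers the display saving translates into roughly a 14% longer runtime, since the display is only part of the total system draw.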
     