
  • That really depends on what their goal is.

    From a business perspective it’s not worth fighting to eliminate 100% of ad block uses. The investment is too high. But if they can eliminate 50%, 70%, or 90% of ad block use on YouTube? That could be worth the effort for them. If they can “win” on Chrome and make it merely annoying on Firefox, that would likely be enough for Google to declare it a huge success.

    People willing to really dig all the way in to get a solution they desire are not the norm. Google can be OK with the 1% of us out there as long as we aren’t also making it possible for another huge chunk of people to piggyback off it effortlessly.


  • I don’t think Kotick is at all certain to be kicked out. As easily as I can see MS letting him go with an enormous golden parachute, I can just as easily imagine them keeping him onboard because all they care about is Activision’s ability to make money.

    In all likelihood Blizzard isn’t going to be managed any differently. Microsoft’s modus operandi with gaming acquisitions is to leave the leadership in place and let the dev/publisher run itself. Why is everyone expecting different here? The most likely outcome is MS does nothing to Blizzard and Blizzard continues on more or less the same trajectory as before.


  • It’s also because their current shows suck, and because any shows that are actually good get shitcanned after season 2, since Netflix sees less subscriber growth from a show after its second season.

    I’m always surprised at how often other people (not you) will defend this practice from Netflix. It’s a classic case of following the data in a stupid way. If their data shows that interest drops off after two seasons, I don’t doubt it.

    But… that comes with a cost. They have built a reputation as a company that doesn’t properly finish shows that they start, that will leave viewers hanging. That makes it harder to get people invested in a new series, even one that’s well reviewed. Why get interested in something you know will end on a cliffhanger?

    That kind of second-order impact from their decision isn’t going to show up in the data. Doesn’t change that it happens all the same.





  • I’m planning to upgrade from a 12 mini, which partly influenced my choice of years too (having seen the 3-year data was the main part!). If I had a 12 Pro I think I’d have kept it for an extra year, but the battery is just not sufficient for how my phone use has changed.

    To add to your extra details here: I saw someone point out that one of Apple’s slides for the base 15 was comparing its performance to the base 12. Apple knows how often people upgrade. Picking the 12 as a comparison point wasn’t an accident; people coming from the 12 are the single largest target audience for the 15. And in a year, they will in all likelihood compare the 16 to the 13 for the same reason.


  • This year’s new phones are for people who last bought a phone in 2020 or earlier. If the average user is on a three-year upgrade cycle (which is what the data shows, as I recall), then you’d expect roughly 1/3 of people to upgrade every year.
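    To put rough numbers on that (the install-base figure below is made up, it’s just there to show the shape of the math):

    ```python
    # Toy model, not real sales data: assume a fixed install base where
    # everyone replaces their phone every 3 years.
    install_base = 300_000_000
    cycle_years = 3

    # Annual releases: upgrades (and revenue) spread evenly across the cycle.
    annual_releases = [install_base // cycle_years] * cycle_years

    # One release every 3 years: everyone buys in the launch year, then nothing.
    triennial_release = [install_base, 0, 0]

    print(annual_releases)      # [100000000, 100000000, 100000000]
    print(triennial_release)    # [300000000, 0, 0]
    ```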

    This is better for Apple, as it keeps their revenue more spread out instead of heavily concentrated in year one of a three year cycle.
    This is better for consumers, as it means new features and upgrades are constantly being rolled out. If they want to upgrade early they can, and they’ll get new features even if it’s only been two years.
    This is also better for both Apple and consumers because there are more opportunities to course-correct or respond to feedback. If Apple only released a phone every second or third year, it would have taken that much longer for the switch to USB-C.

    Just because a new product is launched does not mean you need to buy it. Nvidia released a new GPU last year, but I didn’t buy it even though it’s newer than what I currently have. Arguing that new phones shouldn’t come out each year is like arguing that new cars shouldn’t come out each year. It makes no sense.


  • That depends on how long the FCC is able to keep it in place, IMO.

    Something that gets lost a lot in policy discussions is that once you implement a business regulatory policy like this, you create a constituency for that policy. That’s an advantage in preserving hard-fought gains, but it also means the timelines need to work out. The problem net neutrality faced the first time is that it (a) came late in Obama’s presidency, (b) was held up by court cases, and (c) was reversed early on by Trump’s FCC. There wasn’t much time for the internet business community to build business models around it.

    If net neutrality is regulated into existence and stays in place for 5+ years, businesses will have come to rely on it by that point. Taking it away will be harder, especially for a big pro-business party if it’s getting an earful from megacorporations that want things to stay as they are.

    Of course, I do agree that legislating it is the most robust option and would be the best course of action. I just don’t see legislation as the only option with any longevity. FCC rules can have that longevity too if the timelines work.




  • The practical performance differences between N3B and N3E should be more or less immaterial to the end user. N3E just has a lower defect rate, meaning a greater portion of chips come out usable under that process than under N3B. There was a fairly credible rumor a few weeks ago that Apple is paying TSMC per valid chip instead of the industry-standard per-wafer pricing. So for us, the end users, the cost won’t even be passed down; that’s a cost TSMC has to bear.
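    To put rough numbers on why that pricing arrangement matters (the yield, die count, and prices below are all made up, they just show the mechanics):

    ```python
    # Toy numbers only: real N3B yields, die counts, and wafer prices aren't public.
    dies_per_wafer = 600      # hypothetical chips per 300mm wafer
    yield_rate = 0.70         # hypothetical N3B yield
    wafer_price = 17_000      # hypothetical per-wafer price in dollars

    good_dies = dies_per_wafer * yield_rate

    # Industry-standard per-wafer pricing: the buyer eats the defective dies,
    # so a worse yield means a higher effective cost per usable chip.
    cost_per_good_die = wafer_price / good_dies
    print(round(cost_per_good_die, 2))   # 40.48 at 70% yield; 33.73 at 84% yield

    # Rumored per-good-die pricing: the buyer pays a flat rate per working chip,
    # so the cost of bad yield lands on TSMC instead of Apple.
    price_per_good_die = 40.0            # hypothetical flat rate in dollars
    print(price_per_good_die)            # unchanged no matter what the yield is
    ```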

    That said, if you don’t need a new phone now, waiting is good in general. Whatever is out today, they’ll have something better next year. Wait as long as you’re willing and able between upgrades. Unless you’re absolutely loaded with money, I guess.


  • In this case it’s not truly a result of limited fab availability.

    TSMC has two main variants of their 3nm node. The original one, which Apple is using, is N3B. It has worse yields, so TSMC started work on another variant, N3E. N3E has much better yields but will not be ready until late 2023 or early 2024. Everyone else besides Apple opted to skip N3B and go straight to N3E. Apple, with their very consistent release cadence, didn’t want to wait for N3E. So Apple, and only Apple, is using N3B.

    Thus, we have:
    (1) TSMC only has one 3nm node in 2023: N3B.
    (2) TSMC only has one customer for N3B: Apple.
    (3) TSMC will never have any other customer use N3B, and has no incentive to build capacity beyond what is needed now.

    It’s effectively tautological that their entire 3nm allocation will be sold exclusively to Apple in 2023.


  • It’s especially egregious with high-end GPUs. Anyone paying >$500 for a GPU, let alone $1000, is someone who wants to enable ray tracing. I don’t get what AMD is thinking at these price points.

    FSR being an open feature is great in many ways, but long-term its hardware-agnostic approach is harming AMD. They need hardware-accelerated upscaling like Nvidia and even Intel have. Give it some similar-sounding name (Enhanced FSR or whatever) and make it use the same software hooks, so that both versions can run off the same game functions (similar to what Intel did with XeSS).
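    Very roughly, the idea looks something like this (the names are made up, and the real thing would live in C++/HLSL behind an engine plugin; this is just a sketch of one game-facing hook with two backends, which is basically what XeSS does with its XMX and DP4a paths):

    ```python
    # Hypothetical sketch of "one upscaling hook, two implementations".
    from dataclasses import dataclass

    @dataclass
    class Frame:
        width: int
        height: int

    def has_ml_accelerators() -> bool:
        # Stand-in for a real hardware capability query (matrix/tensor units present?).
        return False

    def upscale_accelerated(frame: Frame, out_w: int, out_h: int) -> Frame:
        # Would run the ML upscaling model on dedicated matrix hardware.
        return Frame(out_w, out_h)

    def upscale_generic(frame: Frame, out_w: int, out_h: int) -> Frame:
        # Would run the cheaper, hardware-agnostic path (FSR-style).
        return Frame(out_w, out_h)

    def upscale(frame: Frame, out_w: int, out_h: int) -> Frame:
        # The single hook a game integrates against, regardless of GPU vendor.
        if has_ml_accelerators():
            return upscale_accelerated(frame, out_w, out_h)
        return upscale_generic(frame, out_w, out_h)

    print(upscale(Frame(1280, 720), 2560, 1440))   # Frame(width=2560, height=1440)
    ```

    The game only ever calls the one function; the runtime picks whichever path the hardware can actually accelerate.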


  • I agree, and it’s just strange from a business perspective too. Obviously the people in charge of AMD feel that this is the correct course of action, but they’ve been losing ground for years and years in the GPU space. At least as an outside observer, this approach is not serving them well on the GPU side. Pricing more aggressively today would hurt their margins temporarily, but in such a mindshare-dominated market they need to start growing their market share early. They need people to use their shit and realize it’s fine. They did it with CPUs…


  • GPU prices being affordable is definitely not a priority of AMD’s. They price everything to be barely competitive with the Nvidia equivalent: 10-15% cheaper for comparable raster performance, but with far worse RT performance and no DLSS.

    Which is odd, because back when AMD was in a similar performance deficit on the CPU front (Zen 1, Zen+, and Zen 2), they had absolutely no qualms or (public) reservations about pricing their CPUs where they needed to be. They were the value kings on that front, which is exactly what they needed to be at the time. They need that with GPUs and just refuse to go there; they follow Nvidia’s pricing lead instead.