• 0 Posts
  • 75 Comments
Joined 2 years ago
Cake day: July 5th, 2023

  • The value a thing creates is only part of what determines whether the investment in it is worth it.

    It’s entirely possible that all of the money that is going into the AI bubble will create value that will ultimately benefit someone else, and that those who initially invested in it will have nothing to show for it.

    In the late ’90s, U.S. regulatory reform around telecom prepared everyone for an explosion of investment in hard telecommunications infrastructure: cell phones were starting to become a thing, and consumer internet held a ton of promise. So telecom companies started digging trenches and laying fiber, at enormous expense to themselves. Most ended up in bankruptcy, and the assets eventually went to buyers who picked them up for pennies on the dollar at bankruptcy auctions.

    Some companies owned fiber routes that they didn’t even bother using, and in the early 2000s there was a shitload of dark fiber scattered throughout the United States. Eventually the bandwidth needs of near-universal broadband gave that old fiber some use. But the companies that built it had already collapsed.

    If today’s AI companies can’t actually turn a profit, they’re going to be forced to sell off their expensive assets at some point. Maybe someone else can make money with them. But the life cycle of this tech is much shorter than that of the telecom infrastructure I was describing, so a stale LLM might very well become worthless within years. Or it’s only a stepping stone toward a distilled model that costs a fraction as much to run.

    So as an investment, I’m not seeing a compelling case for AI today. Even if you agree that it will provide value, it doesn’t make sense to invest $10 to get $1 of value back.


  • Intel is best thought of as two businesses, where their historical dominance in one (actually fabricating semiconductors) protected their dominance in the other (designing logic chips), despite not actually being the best at the latter.

    Intel’s fabs represented the cutting edge of semiconductor manufacturing, and their superiority in that business almost killed AMD, who just couldn’t keep up. Eventually, AMD decided to stop chasing cutting-edge manufacturing and spun off its fabs in 2009 as an independent company, GlobalFoundries.

    But Intel hit a wall in semiconductor manufacturing, running into roadblock after roadblock with a new type of transistor known as the FinFET. The biggest delays came with Intel’s 10nm process, where yields never got to where they should have been, while other foundries like Samsung and TSMC passed Intel by. Their actual CPU business suffered as a result, because AMD, now a fabless chip designer, could go all in on TSMC’s more advanced processes. And because AMD was fabless, it pioneered advanced packaging for “chiplet” designs, where different pieces of silicon are connected so that they act like a single chip. The individual components can be small enough that manufacturing defects don’t hurt yield as badly, and the cheap and expensive processes can each be matched to the parts of the “chip” that actually need the performance and precision.

    Meanwhile, Apple was competing with Qualcomm and Samsung in mobile Systems on a Chip (SoCs) for phones, and developed its own silicon expertise. Eventually, Apple was able to scale up performance (with TSMC’s help) to make a competitive laptop chip based on the principles of its mobile chip designs (and then eventually desktop chips). That allowed Apple to stop buying Intel chips and switch to its own designs, manufactured by TSMC. Qualcomm is also attempting to get into the laptop/small PC market by scaling up its mobile chip designs, likewise manufactured by TSMC.

    Intel can get things right if it catches up with or surpasses TSMC in the next paradigm of semiconductor manufacturing. Transistors are changing from FinFET (where TSMC has utter dominance) to GAAFET (where Intel, TSMC, and Samsung are all jockeying for position), and manufacturers are trying out backside power delivery (where the transistors are powered from underneath rather than from the cluttered top side). Intel has basically gone all in on its 18A process, and in a sense it’s a clean slate in the competition with TSMC (and to a lesser degree Samsung, a new Japanese company named Rapidus, and possibly even Chinese companies like SMIC).

    But there are negative external signs. Intel has acknowledged that it doesn’t have many outside customers signing up for foundry services, so it isn’t exactly poaching clients from TSMC. If that’s happening while TSMC is making absurd profits, it suggests that the potential clients who have seen Intel’s tech under NDA believe Intel is falling further behind TSMC. At that point, Intel will struggle to compete on logic chips (CPUs against AMD, Apple, and maybe Qualcomm; discrete GPUs against AMD and NVIDIA) if everyone is just paying TSMC to make their chips for them.

    So I don’t think all of their layoffs make a ton of sense, but I understand that they’re really trying to retake the lead in fabrication, with everything else a lesser priority.


  • Apple’s discounting strategy is generally to sell last year’s model, and sometimes the model before that, at roughly $200 off for each year since its release. They sometimes release a lower-spec model (the 16e is the current example; prior SE models, and even the mini models from previous generations, were part of this strategy as well), and that sometimes means the 2-year-old model isn’t kept available as long.

    That’s where their 5-7 year support window really shines: they can keep selling older models at a discount, knowing that the new owner will still get 3-5 years of support.
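
    Roughly, the math works out like the sketch below (a toy model; the launch price and 7-year window are illustrative assumptions, not Apple’s actual numbers):

    ```python
    # Toy model of the discount ladder and support window described above.
    # The $999 launch price and 7-year window are illustrative assumptions.
    def discounted_price(launch_price: int, years_old: int) -> int:
        """Roughly $200 off for each year since release."""
        return max(launch_price - 200 * years_old, 0)

    def remaining_support(support_window: int, years_old: int) -> int:
        """Years of updates left on a phone that is years_old years old."""
        return max(support_window - years_old, 0)

    for age in range(5):
        print(f"{age} years old: ${discounted_price(999, age)}, "
              f"~{remaining_support(7, age)} years of support left")
    ```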

    The other thing is that the used market for iPhones is pretty robust. I can go buy used phones that are 3 or 4 years old and still get a good 1-4 years of additional support. At least in the U.S., if you told me my budget for a phone was gonna be $300 for the next 2 years, I think I’d probably buy a used iPhone.

    As it currently stands, I’m still on Pixels on a 2-year cycle, but I also know that my “sell used to offset the price of my new phone” strategy would be much cheaper if I did it with iPhones instead of Pixels.


  • > The sun loses 130 billion tons of matter in solar wind every day.

    But how much can be caught?

    From the sun, the angular diameter of the earth (12,756 km wide, 149,000,000 km away) is something like 0.004905 degrees (or 0.294 arc minutes or 17.66 arc seconds).

    Imagining a circle the size of Earth, at the distance of Earth, catching all of the solar wind, we’re still looking at a cross-section of only about 127.8 x 10^6 square kilometers. A sphere with a radius of Earth’s average distance from the sun would have about 279.0 x 10^15 square km of total surface area. So, oversimplifying with the assumption that the solar wind is uniformly distributed, an Earth-sized solar wind catcher would only intercept about 4.58 x 10^−10 of it.

    Taking your 130 billion tons number, that means this Earth-sized solar wind catcher could collect about 59.5 tons of matter per day, almost all of which is hydrogen and helium, with what little remains still skewing toward the lighter elements. Even if we could theoretically use all of it, would that truly be enough to meet humanity’s mining needs?
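
    For anyone who wants to check the arithmetic, here’s a quick back-of-the-envelope sketch in Python using the figures above:

    ```python
    import math

    # Figures from the comment above (rounded)
    earth_diameter_km = 12_756
    sun_earth_distance_km = 149_000_000
    solar_wind_tons_per_day = 130e9  # your number

    # Angular diameter of Earth as seen from the sun
    angular_diameter_deg = math.degrees(earth_diameter_km / sun_earth_distance_km)

    # Earth-sized catcher cross-section vs. the full sphere at 1 AU
    catcher_area_km2 = math.pi * (earth_diameter_km / 2) ** 2
    sphere_area_km2 = 4 * math.pi * sun_earth_distance_km**2
    fraction = catcher_area_km2 / sphere_area_km2

    print(f"angular diameter: {angular_diameter_deg:.6f} degrees")    # ~0.004905
    print(f"fraction of solar wind caught: {fraction:.3e}")           # ~4.58e-10
    print(f"tons per day: {solar_wind_tons_per_day * fraction:.1f}")  # ~59.5
    ```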


  • It’s like the relationship between mathematics and accounting. Sure, almost everything accountants do involves math in some way, but it’s relatively simple math that is a tiny subset of what all of mathematics is about, and the actual study of math doesn’t really touch on the principles of accounting.

    Computer science is a theoretical discipline that can be studied without computers. It’s about complexity theory, algorithms, data structures, and the mathematical/logical foundations of computing. Actual practical programming work doesn’t really touch on that, although many programmers are aware of those concepts and keep them in the back of their mind while coding.
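
    As a toy example of that back-of-the-mind kind of knowledge (the functions and data here are mine, purely for illustration):

    ```python
    # Two ways to check a list for duplicates: same answer,
    # very different complexity classes.
    def has_duplicates_quadratic(items: list) -> bool:
        # O(n^2): compare every pair of elements
        return any(
            items[i] == items[j]
            for i in range(len(items))
            for j in range(i + 1, len(items))
        )

    def has_duplicates_linear(items: list) -> bool:
        # O(n): a set (hash table) remembers what we've already seen
        seen = set()
        for item in items:
            if item in seen:
                return True
            seen.add(item)
        return False

    print(has_duplicates_quadratic([3, 1, 4, 1]))  # True
    print(has_duplicates_linear([3, 1, 4]))        # False
    ```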


  • People who get downvoted a lot end up with a ‘low reputation’ indicator next to their name. You’ll know it when you see it.

    Upvotes in meme communities do not add to reputation.

    I think any kind of reputation score should be community specific. There are users whose commenting style fits one community but not another, and their overall reputation should be understood in the context of which communities actually like them rather than some kind of global average.
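
    A minimal sketch of what that could look like, assuming per-community vote tallies are available (the community names and vote data here are hypothetical, not how any instance actually works):

    ```python
    from collections import defaultdict

    # Hypothetical per-community reputation: votes only count within
    # the community they were cast in, and meme upvotes are ignored.
    MEME_COMMUNITIES = {"memes"}  # placeholder list

    votes = [
        # (community, user, delta): +1 for an upvote, -1 for a downvote
        ("linux", "alice", +1),
        ("linux", "alice", +1),
        ("memes", "alice", +1),      # ignored per the rule above
        ("worldnews", "alice", -1),
    ]

    reputation = defaultdict(lambda: defaultdict(int))
    for community, user, delta in votes:
        if delta > 0 and community in MEME_COMMUNITIES:
            continue  # upvotes in meme communities don't add to reputation
        reputation[user][community] += delta

    print(dict(reputation["alice"]))  # {'linux': 2, 'worldnews': -1}
    ```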


  • Porn-related transactions have a higher-than-average rate of chargebacks. Maybe post-nut clarity motivates people to say “wait, hold on, I shouldn’t have spent that money, I must’ve been hacked.” Or maybe it’s people saving face when confronted by their spouse or other family members with a transaction log. Or maybe it’s just the type of transaction that actual card fraudsters gravitate toward, so there really is a higher percentage of unauthorized transactions.

    Gambling-related merchants have a similar problem with payment processors. For many processors, it’s just a straightforward business concern, not any kind of ethical issue.


  • > From a business perspective it makes sense, to throw all the rendering to the devices to save cost.

    Not just to save cost. It’s basically OS-agnostic from the user’s point of view. The web app works fine on desktop Linux, macOS, or Windows. In other words, when I’m on Linux I can get a solid user experience in apps designed by people who have never thought about Linux in their lives.

    Meanwhile, porting native programs between OSes often means someone’s gotta maintain the libraries that call the right desktop/windowing APIs across each version of Windows, macOS, and the various Linux windowing systems, not all of which behave in expected or consistent ways.


  • MacBook seamless suspend/sleep performance is like 25% of why my personal daily driver is macOS. Another 50% is battery life, in which sleep/suspend management plays a part. I’ve played around with Linux on Apple hardware, but it’s just never quite been there on power management or sleep/wake functionality. That’s mostly Apple’s fault for poor documentation and support for other OSes, but it is what it is, and I got sick of fighting it.


  • > And while information itself can be a “product” or be provided as a service, in most cases, it’s not.

    Sure, but my point is that the same is true of physical machines. People don’t want working machines for the sake of working machines. They want working machines to actually do something else, to output a “product” of that machine’s operation.

    And viewed in that way, information services are as much a standalone “product” as maintenance/repair services. Information services account for trillions of dollars of economic activity for a reason.