• 0 Posts
  • 15 Comments
Joined 7 months ago
Cake day: August 30th, 2025

  • Yeah a friend of mine got a new job last year and they finish off every sprint by creating a new release and pushing it to prod on Friday afternoon. She was shocked when she found out about that.

    The company doesn’t have a definition of done for the devs and doesn’t include QA in the release cycle. So the devs just push broken, unfinished shit into releases. Then when QA points out all of the issues, the devs go crying to management about how they can’t keep to the schedule for the new sprint if QA makes them work “endlessly” on tasks that were “finished” long ago, basically blaming QA for the issues the devs themselves caused. Management and sales overpromise to customers, so they want the devs working on the new stuff, having promised it was already in place when development wasn’t even planned, let alone started. QA and support have to deal with the fallout, handling customers who got non-working broken shit and are pissed about it. Management just tells support to forward any “troubled” customers to them, as they will “handle” it. But all management does is overpromise again until the customer is happy, setting up an endless loop of failure and disappointment.



  • I don’t think you have any idea how hard EUV actually is. ASML was told for decades that it could never be done and that they were throwing money away trying to make it happen. Even inside the company, a lot of folks were against the whole thing, stating it was not possible. If leadership hadn’t stuck by it during development, it would never have happened.

    It took advances in physics, math and engineering to even create the technology, let alone make it reliable, fast and cheap enough for mass production. It’s a huge advancement and has a good few years of optimizations and improvements ahead of it.

    What’s next after EUV? I don’t think anyone really knows; this might be the end of the line as far as shrinking node sizes goes, and we’ll need to look into novel structures and materials. Or who knows, ASML might have something else cooking in a top secret project.



  • Because in most cases they can only do the thing they do because another company invested millions to make, release, promote and support the game. Without that work, the modders would have nothing to mod. Working out a licensing deal with every modder to split the revenue is a lot of work, and most mods won’t get played much anyway, so it isn’t worth the hassle. So to accommodate the community and keep the game active for longer, the terms are that modding is allowed and even encouraged. But the other side of the bargain is that the mods can’t be sold. And usually the company reserves the right to outright ban mods using legal means, for example when people mod in far-right extremism the company doesn’t want to be associated with.

    Now there is a gray area where people donate to modders or even pay outright for modders to build certain things. This is usually just fine, as long as the mod is also available for free. People aren’t paying for the mod, they are paying for the dev time, which is totally fine.

    But this modder specifically put access to his mods behind his Patreon. Sure, technically you could subscribe for a month, get the mod and then stop the subscription. But that’s legally still a paywall, and in practice the mod needs to be updated often to keep working.

    So it’s pretty simple in this case: the modder was asked to stop putting the mods behind a paywall, he didn’t, so he got a cease and desist. Usually I’m all for the little guy and against the large companies, but in this case the terms were pretty clear and the modder violated them.

    Now we could have a more general discussion about how and if modders should be compensated for their time. But I feel that’s a bit beyond this single case.


  • Alright, I’ve got nothing for you then.

    I didn’t think the thing would be good. When he got it in, we spent a day running benchmarks and fooling around with it. We compared it to his old workstation and my desktop system. It wasn’t a very controlled environment; we were just having fun and putting the thing through its paces.

    I asked my friend yesterday how he likes the machine, having worked with it for some time now, and he’s really happy with it. It is faster across the board than his old machine and is wonderful to work with. He can set up complex simulations and take the machine with him to the office. This was always a bit of a pain point in the past, where he would run the simulations at home on his workstation but then could only share the results. Sometimes they would rent server time to run a simulation on a cloud system, but that was a bit of a hassle and had costs. Now he just unplugs his notebook, puts it in the bag and off he goes. He also no longer has two systems from work he needs to regularly log into and keep up to date. Sometimes a couple of months would go by where he didn’t need the laptop, and it would get fussy over missing updates etc. So for him at least it’s a big win, and to me it shows you can run some pretty heavy stuff on those machines.

    Are there faster machines out there? Absolutely. Are there even better notebooks out there? For sure, Apple M3 is faster and M4 is even faster still. And with Apple the performance per watt is better as well. But running Windows on those is (for now at least) not something that’s suitable for work. The security department would certainly not approve of a highly modified version of Windows.

    The whole point of this post was that Arm chips might be huge in the future, and I have to agree. These current-gen Arm CPUs are impressive and the next gen will be even more so.

    You also seem to suggest that running benchmarks and running applications are somehow not the same thing? Sure, not all benchmarks are realistic; they’re more an indication of relative performance, to easily compare different systems. And not all applications stress the system the same way. But every benchmark I’ve seen says that notebook is on par with or exceeds the performance of my 5950X desktop, and to me that’s impressive. In the real world, if we are using simple office applications or websites/web-apps, I doubt we would notice the difference in performance; both are equally fast, and perhaps the latency of the internet connection is a bigger factor there. But something like Speedometer shows the real-world browser performance of the laptop is better than on my desktop.

    Did the engineers at Qualcomm spend a couple of weeks with a small team to optimise a custom Linux environment for Geekbench and put a boatload of cooling on the chip? Sure, I believe that. They want to show the CPU in the best possible conditions. Is the real world performance still very good? Yes, it is. And there are so many notebook reviews that back this up.

    Are there also terrible notebooks with a CPU throttled all the way down and lacking enough cooling? Also yes. But the same can be said for x86 notebooks. Especially Intel notebooks of 12th and 13th gen, those ran hot and slow all the time.

    If you are convinced all Arm notebooks suck, I’m not here to change your mind, I’m not here to provide any kind of proof. All I can tell you is I know of one real life case where I saw with my own eyes the thing was pretty damned good. If you don’t believe me, that’s just fine. It’s just a discussion on the internet, don’t take it too seriously.

    It’s not like anyone can afford a new laptop in 2026, with the RAM prices being what they are. So it probably won’t be the year of the Arm CPU, no matter how good those chips actually are.


  • It’s a work machine, he uses it for work. He runs a custom simulation package from work; I can’t name the app without doxxing my friend. It scales well with CPU and memory and uses an optimal number of threads for the number of cores (and even works well with stuff like multiple CPUs or different cores having access to different caches). For running most stuff at least 32GB of memory is required, and for the stuff he does 64GB is an absolute must. Simulations take between 20 mins and 8 hours depending on what he wants from them. The simulation tool does not use the GPU at all, so that’s a non-factor. The tool is x86 based, with an Arm version coming soon™, so there might even be performance improvements in the future. The simulations run faster in all scenarios compared to his old workstation, even the long ones. Cooling is not an issue on this particular machine, and with the many-core load, boosting isn’t done anyway.
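    Sizing a worker pool to the core count, as that tool does, is a common pattern. A minimal Python sketch, where `simulate` is a hypothetical stand-in for one slice of the workload (not the actual app, which I can't name):

    ```python
    import os
    from concurrent.futures import ThreadPoolExecutor

    def simulate(chunk):
        # Hypothetical stand-in for one slice of the simulation workload.
        return sum(i * i for i in chunk)

    def run_all(chunks):
        # Size the pool to the number of available cores. Real CPU-bound
        # work in Python would use a process pool (or native threads) to
        # sidestep the GIL; a thread pool keeps the sketch simple.
        workers = os.cpu_count() or 1
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(simulate, chunks))
    ```

    Whether this scales well depends entirely on the workload; the point is just that the tool parallelises across however many cores the machine has.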

    We ran Speedometer because many laptop reviews include that one and it’s very quick and easy to run. Specifically the 3.0 version because we had a source open with Apple M benchmarks that included that one. The result was somewhere around Apple M2, maybe a bit faster than an M2 but def slower than an M3.

    You can try it yourself and read about it here: https://browserbench.org/Speedometer3.0/ It benchmarks regular use in web-apps, as a lot of apps these days are web-apps. So it gives an impression of every-day tasks in websites and web-apps.

    I think the results you mention back up what I said? It regularly outperformed my 5950X desktop machine in benchmarks, or was at least on par in other cases. My desktop is a big case with water cooling, and when benching the fans do make a bit of noise. That little notebook outperformed it and its fan was barely noticeable.

    Like I said, I was sceptical, but that thing impressed me a lot. You can draw your own conclusions; that doesn’t really matter to me. If you think Arm laptops suck by definition, that’s fine, you do you. But don’t say you can’t use one for heavy applications, because at least for some cases that’s just not true. I think the GPU (especially the driver) is a weak point for these systems, so anything that leans on it should probably use a system with a separate GPU and not the built-in one. But this is also true for Intel and AMD, so not really any difference there.


  • We tried a whole bunch of benchmarks, and the laptop was on par with or faster than the older Threadripper workstation and my 5950X desktop. Most benchmarks were multithreaded, but there was some singlethreaded stuff as well. He uses the system to run simulations for work, and that software also runs faster than on the old workstation. I can’t run that on my system, so I wouldn’t know how it compares.

    I don’t have the exact bench results as we didn’t write them all down, just ran and compared. But I do have a screenshotted result of 27.9 in Speedometer 3.0, which is pretty good I think.

    As it’s his laptop from work he runs Windows on it, the new Windows-on-Arm version, which wasn’t even fully released at the time he got it. That version seems to be a big step up from the old Arm Windows that was used for budget Bing books. His model is the most high-end one: 15" with 64GB of memory and a 1TB SSD, with a Qualcomm Snapdragon X Elite X1E-80-100 CPU. That one has some pretty good cooling inside.

    I was sceptical at first as well, as I would have thought the performance wouldn’t be great and there would be compatibility issues. But he’s been using it for a while now and says everything works just fine. Replacing a big-box workstation with a thin and light notebook and having it perform better is pretty wild. There are absolutely faster systems available; for example a 9950X system or a latest-gen Threadripper workstation would be faster. But that would have been more expensive, because those systems cost more to start with and he would then need a separate laptop as well. Having something in a thin laptop form factor be an upgrade in performance is pretty mind-blowing.


  • Heavy-duty desktop applications are excellent on Arm. I know someone who has one of those new Microsoft Surface laptops with an Arm chip, and that thing is impressive. It outperforms my Ryzen 5950X desktop in many cases and does it all without breaking a sweat. I had my doubts beforehand whether it would work, especially since a lot of x86 stuff goes through an emulation layer. But even in x86 benchmarks that thing is super fast. And with 64GB of memory you can run some really heavy-duty stuff. He uses it as a workstation for work and it outperforms the big box he had before; he can take it anywhere, and it was cheaper than the original workstation as well. And with a USB-C hub he can connect his multiple monitors and keyboard, so the setup really hasn’t changed much from before. Only downside is Windows sucks ass and messes up the monitor config when the thing gets disconnected, but he wrote a script to fix it, so not the biggest of deals.


  • You are probably a prosumer, somebody who knows their stuff and doesn’t want an inferior experience like on a tablet, console or notebook. Something upgradable, to invest in and use for many years. That market will certainly exist, but prices will be much higher than they are today.

    I remember back in the day when I bought my 40MB hard drive; it was around $3000 in today’s money. I had to really save up to get that thing. I labelled the partition “LARGE” because my mind was blown at 40 whole megabytes of storage.

    No idea where we are headed, it’s pretty uncertain at this point.


  • I think we might be seeing the end of consumer desktop computers coming to pass. The market will split into phones and tablets, which most people use for most things a computer would traditionally be used for, and laptops, which are widely used for more work-like stuff. Thin clients connecting to virtual desktops are already the norm in a lot of companies. Desktop computers might go back to where they were in the 80s and early 90s: very expensive, high-end prosumer equipment, only for real enthusiasts who see it as a hobby and want to invest heavily, or professionals who need the local compute power of a workstation. The computer industry was already in big trouble just before covid; then we suddenly all needed work-from-home setups, or spent more time at home so wanted a new, better computer, which caused the industry to pick back up. This AI bubble might just kill it for good: with prices skyrocketing, people will be hesitant to buy new hardware for a while.




  • Half Life 2 was released over 20 years ago. It was meant to run on what is now regarded as ancient hardware.

    When Half Life 2 released there was actually a whole lot of grumbling from gamers, as the system requirements were very high. It ran like shit or didn’t run at all unless you had very recent and high-end hardware.

    I remember buying a new GPU back then specifically because of HL2. I didn’t have a lot of money, so I bought an Asus 6800 card, which wasn’t powerful enough to run HL2. With a bit of luck, though, those could be modded and overclocked into a 6800 Ultra, which was powerful enough. It was a lottery whether this was possible and ran without issues. The first card I bought couldn’t do it, so I went to the shop and returned it. Went to another shop and bought one there, which also didn’t work. Then I went over to another town and bought one there, which finally worked out. Even though it was a mid-tier card, GPUs were expensive back then, so it cost me all of the money I’d saved up over a couple of years.

    HL2 got a lot of optimizations as the years went on, but when it first released it was an example of an unoptimized game. And just like these days, people were bitching about it.



  • “…digging through a shoebox of game carts. For someone who wasn’t alive for that era of gaming (not even close, honestly), it’s a neat little glimpse of what it was like.”

    As someone who was alive for gaming in the 80s and 90s, it was nothing like that at all. Unless you were very rich, most people had fewer than 10 games for the one console they had. It would be a small stack by the side of the console, next to the controllers. Games were usually around $70 depending on the game, which is like $160 in today’s money. NES games were cheaper, especially once the SNES was released. So people did wind up collecting NES games (second-hand) once the SNES released. The NES moved to the oldest kid’s bedroom, with the SNES taking its place as the one console in the living room. They might have a shoebox of older games at some point.
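    The "$70 then is like $160 now" figure is just a cumulative inflation adjustment. A quick sketch, where the roughly 2.29× multiplier for the early 1990s is my own ballpark assumption, not an official CPI number:

    ```python
    def adjust_for_inflation(price, multiplier):
        # Scale a historical price by a cumulative inflation factor
        # and round to the nearest dollar.
        return round(price * multiplier)

    # ~2.29x since the early 1990s is a rough assumption, not an official figure.
    print(adjust_for_inflation(70, 2.29))  # roughly $160 in today's money
    ```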

    We did play a lot of games tho; often we would borrow games from other kids in the neighbourhood. Everyone had the same 5 super popular games, but the other games people had varied. Downside was, the easiest ones to borrow were often the ones that weren’t any good. We all knew that one kid who had the Star Wars SNES game and hated it, but you’d only very sparingly get a new game, so you were stuck with it.

    Another thing we did was rent a lot of games. You would go to the rental place and they would have so many games it would blow your mind. They’d have posters up, often large set pieces for some games and movies. It was like kid heaven. Then you’d have about 10 mins to figure out which game to rent, otherwise your dad would get annoyed and tell you to get a move on. People even rented the SNES when it had just released for a weekend, so they would know if it was any good before buying it for the family. It was a big purchase, so you’d better make it worth it.