• 1 Post
  • 80 Comments
Joined 3 years ago
Cake day: January 17th, 2022

  • No doubt, the kernel itself is also quite complex… but my comment here is from the user-experience perspective, namely that, for me at least, “it just works”. So I’m not trying to imply it will work flawlessly for everybody, nor that it’s due to the simplicity of the stack; solely that it works, for me.


  • I’d argue… Alpine?

    Why? Well, because it’s small. So Alpine isn’t the programming distribution itself but rather the distribution for the container you run whatever you build inside of, just because it’s very, VERY small (like… 5MB?!); see the sketch at the end of this comment.

    Obviously that makes sense only in some cases. For example, for a frontend Web developer or a game developer (or a WebXR dev like me) it might not help much, but otherwise… maybe?

    Anyway, if you are into this kind of thing, also check out Gitpod: it’s about wrapping your dev environment inside a container and then having it anytime, anywhere, including for other developers, which facilitates their onboarding.
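
    To make the size argument concrete, here is a minimal sketch of an Alpine-based container; it assumes a small Python app with a hypothetical app.py entry point, standing in for “whatever you build”:

    ```dockerfile
    # The point of Alpine as a base: the bare image is ~5 MB,
    # versus tens to hundreds of MB for debian/ubuntu bases.
    FROM python:3.12-alpine

    WORKDIR /app
    COPY app.py .

    # app.py is a hypothetical stand-in for "whatever you build".
    CMD ["python", "app.py"]
    ```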


  • HP Laser 107w, driverless, over LAN.

    I just Ctrl+P from any software and it prints.

    It also prints programmatically (e.g. for folk.computer) thanks to IPP.

    I haven’t had to “think about printing” since I got that setup, so I don’t know where you get that sentiment.
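
    For the programmatic side, here is a minimal sketch in Python going through CUPS (which speaks IPP to the printer); the queue name "HP_Laser_107w" and the file path are assumptions for illustration, not my actual setup:

    ```python
    # Minimal sketch: submit a print job through CUPS, which talks
    # IPP to a driverless printer on the LAN.
    import cups

    conn = cups.Connection()
    # getPrinters() returns a dict of queue name -> attributes.
    print("Available printers:", list(conn.getPrinters()))

    # "HP_Laser_107w" is a hypothetical queue name; adjust to yours.
    job_id = conn.printFile("HP_Laser_107w", "/tmp/page.pdf",
                            "test job", {"media": "A4"})
    print("Submitted job", job_id)
    ```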


  • As per usual, in order to understand what it means we need to see:

    • performance benchmark (A100 level? H100? B100? GB200 setups?)
    • energy consumption (A100-level performance at lower wattage than an H100? or the other way around?)
    • networking scalability (how many cards can be interconnected for distributed compute? NVLink equivalents?)
    • software stack (e.g. can it run CUDA and, if not, what alternatives can be used?)
    • yield (how many dies are usable, i.e. can it be commercially viable or is it still R&D?)
    • price (which regardless of possible subsidies would come from yield)
    • volume (how many cards can actually be bought, also dependent on yield)

    Still interesting to read after the announcements, as per usual, and especially to see who will actually manufacture them at scale (SMIC? TSMC?).



  • It’s a classic BigTech marketing trick. They are the only ones able to build “it”, and it doesn’t matter if we like “it” or not because “it” is coming.

    I believed in this BS for longer than I care to admit. I thought “Oh yes, that’s progress”, so of course it will come; it must come. It’s also very complex, so nobody but such large entities with so many resources can do it.

    Then… you start to encounter more and more vaporware. Grandiose announcements, and when you try the result you can’t help but be disappointed. You compare what was promised with the result, think it’s cool, kind of, shrug, and move on with your day. It happens again, and again. Sometimes you see something really impressive, you dig, and you realize it’s a partnership with a startup or a university doing the actual research. The more time passes, the more you realize that all BigTech companies do it, across technologies. You also realize that your artist friend did something just as cool, and open source at that. Their version doesn’t look polished, but it works. You find a Kickstarter for a product that is genuinely novel (say the Oculus DK1) and has no link (initially) with BigTech…

    You finally realize, year after year, that you have been brainwashed into believing only BigTech can do it. It’s false. It’s self-serving BS, both to stop you from building and to make you depend on them.

    You can build, we can build and we can build better.

    Can we build AGI? Maybe. Can they build AGI? They sure want us to believe it but they have lied through their teeth before so until they do deliver, they can NOT.

    TL;DR: BigTech is not as powerful as they claim to be and they benefit from the hype, in this AI hype cycle and otherwise. They can’t be trusted.





  • Hmmm, very interesting, thanks for the links and explanation!

    I’m not “ready” for it yet so I’ve bookmarked all that (by adding a file in ~/Apps ;) but that’s definitely an interesting, and arguably neater, solution.

    Honestly I try to stick to the distribution package manager as much as I can (apt on Debian stable) but sometimes it’s impossible. Fetching binaries myself feels a bit “wrong” but usually works. Some, like yt-dlp as I see in your list, do have their own update mechanism. It’s interesting to step back and consider the trade-off. Anyway, thanks to you I now know there are solutions for a middle ground!



  • I did more than 5 installs this weekend (for … reasons) and the “trick” IMHO is …

    Do NOT install things ahead of actually needing them. (Of course this assumes things take minutes to install, and thus that you will have connectivity.)

    For me it meant Firefox was top of the list, VLC or Steam (thus the NVIDIA driver) second, vim as I had to edit a crontab, etc.

    Quite a few are important to me but NOT urgent, e.g. Cura (for the 3D printer), OpenSCAD (for parametric design) or Blender. So I didn’t even install them yet.

    So IMHO, as others suggested, docker/docker-compose, but only for the backend.

    Now… if you really want a reproducible desktop install: NixOS. You declare your setup rather than apt install -y and “hope” it will work out. Honestly I was tempted, but as installing a fresh Debian takes me 1h and I do it maybe once a year at most, there’s no need for me (yet).
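
    For a taste of what “declaring your setup” looks like, here is a minimal sketch of a NixOS configuration.nix fragment; the package list just mirrors the examples above and is not a complete configuration:

    ```nix
    # Minimal sketch of a declarative setup on NixOS: instead of
    # `apt install -y`, packages are listed once here and
    # `nixos-rebuild switch` makes the system match.
    { config, pkgs, ... }:
    {
      environment.systemPackages = with pkgs; [
        firefox
        vlc
        vim
        # Important but not urgent; declared once, present after every rebuild:
        cura
        openscad
        blender
      ];

      # Proprietary NVIDIA driver, also declared rather than installed by hand
      # (a real setup needs the usual hardware.nvidia options too).
      services.xserver.videoDrivers = [ "nvidia" ];
    }
    ```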


  • I… agree, but doesn’t that contradict your previous point that innovation will come from large companies, if they only try to secure monopolies rather than genuinely innovate? From that perspective I don’t understand who is left to innovate if it’s neither research (focused on publishing, even when it has the actual novel insight and verifies that it works), nor the large companies… and startups don’t get the funding either. Sorry if you mentioned it, but I’m now confused as to what is left.


  • They just provide the data. They can question the methodology, or even publish another report with a different methodology, but if the data is correct (namely, not fabricated) then it’s not up to them to decide how it’s used. The user can decide how they define a startup, i.e. which minimum size, funding types, funding rounds, etc. Sharing their opinion on the startup landscape is unprofessional IMHO. They are of course free to do so, but to me it doesn’t undermine the validity of the original report.



  • Research happens through universities, absolutely, and selling products at scale happens through large companies, but that’s not innovation. Innovation is bringing new products, often the result of research yes, to market. Large companies tend to innovate by buying startups. If there are no startups to buy, coming out of university research, I don’t see how large companies, often stuck in the “innovator’s dilemma”, will be able to innovate.


  • Thanks for linking to criticism, but can you highlight which numbers are off? I can see things about ByteDance, Ant Group, Shein, but that’s irrelevant, as it’s not about the number of past successes, solely about the number of newly funded startups. Same with the CEO of ITJUZI sharing his opinion: that’s not a number.

    Edit: it looks totally off, e.g. “restaurants, in a single location, such as one city, you could immediately tell that there were large numbers of new companies.”, as the article is about funding, not a loan from the bank at the corner of the street.



  • Thanks for the in-depth clarification and for sharing your perspective.

    this is a good development

    Keeping finance in check is indeed important so I also think it’s good.

    What about the number of funded startups though, and the innovative products they would normally bring to customers? Do you believe the measures taken will only weed out bad financiers, or will they also, as a side effect, result in fewer products and solutions being brought out? Does it mean research will remain academic but won’t necessarily be commercialized or even scaled? If you believe it will still happen, how? Through state or regional funding, and if so, can you please share such examples that have grown over the last 5 years?