Unless you’re underweight, in which case the solution to everything is to eat more. Yes, doctor, I am skinny, but I don’t think that has anything to do with me asking to get on minoxidil.
DaPorkchop_@lemmy.ml to Mildly Interesting@lemmy.world • 2.06 meters tall volleyball player Anna Smrek (1.60m man next to her for scale) • 101 · 6 days ago
Here’s another one for good measure
Xaturday Korning Creakfast Dereal
DaPorkchop_@lemmy.ml to egg_irl@lemmy.blahaj.zone (Memes about being trans people in denial and other eggy topics) • [Transfem meme] egg💉irl • 5 · 14 days ago
I don’t think this is good advice tbh. I tried this myself; after 5 weeks I felt no mental changes (good or bad) and DID get seemingly irreversible boob growth.
DaPorkchop_@lemmy.ml to Technology@lemmy.world • Say Hello to the World’s Largest Hard Drive, a Massive 36TB Seagate • 6 · 17 days ago
There are a number of enterprise storage systems optimized specifically for SMR drives. This is targeting actual data centers, not us humble homelabbers masquerading as enterprises.
DaPorkchop_@lemmy.ml to Linux Gaming@lemmy.world • Anyone have trouble with their Beelink? • 81 · 17 days ago
Framerate above 20 in what, with what settings? That’s kinda key information :P
DaPorkchop_@lemmy.ml to Linux Gaming@lemmy.world • Anyone have trouble with their Beelink? • 32 · 17 days ago
You shouldn’t need to download any graphics drivers. Ubuntu (and pretty much every other distribution) ships with the open-source AMD driver stack by default, which is significantly better and less hassle than the proprietary drivers for pretty much all purposes. If you’re getting video out it’s almost certainly already using the internal GPU, but if you’re unsure you can open a terminal and run
sudo apt install mesa-utils
and then
glxinfo -B
to double-check what is being used for rendering.
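For reference (the exact wording varies by Mesa version and APU model, so treat this as an illustrative sketch rather than the exact output), the glxinfo -B report includes a renderer line roughly like
OpenGL renderer string: AMD Radeon Graphics (radeonsi, ...)
when the AMD iGPU is doing the rendering; if it says llvmpipe instead, you’ve fallen back to software rendering on the CPU.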
DaPorkchop_@lemmy.ml to Hardware@lemmy.world • Nvidia chips become the first GPUs to fall to Rowhammer bit-flip attacks • 11 · 17 days ago
While I am no fan of NVIDIA, this headline seems somewhat disingenuous in making it sound like NVIDIA’s fault. They aren’t the ones making the memory chips.
DaPorkchop_@lemmy.ml to Green Energy@slrpnk.net • The Amount of Electricity Generated From Solar Is Suddenly Unbelievable • 21 · 17 days ago
Where are these negative prices? I’m in Switzerland and my electricity price just keeps going up.
DaPorkchop_@lemmy.ml to science@lemmy.world • Tiny gut “sponge” bacteria found to flush out toxic PFAS “forever chemicals” • 3 · 17 days ago
Isn’t that just passing the PFAS on to whoever ends up getting injected with your donation?
Thinking of a modern GPU as a “graphics processor” is a bit misleading. GPUs haven’t been purely graphics processors for 15 years or so, they’ve morphed into general-purpose parallel compute processors with a few graphics-specific things implemented in hardware as separate components (e.g. rasterization, fragment blending).
Those hardware stages generally take so little time compared to the rest of the graphics pipeline that it normally makes the most sense to have far more silicon dedicated to general-purpose shader cores than to the fixed-function graphics hardware. A single rasterizer unit might be able to produce up to 16 shader threads’ worth of fragments per cycle, so even if your fragment shader is very simple and only takes 8 cycles per pixel, you can keep 8 × 16 = 128 cores busy with only one rasterizer in this example.
The result is that GPUs are basically just a chip packed full of a staggering number of fully programmable floating-point and integer ALUs, with only a little bit of fixed hardware dedicated to graphics squeezed in between. Any application which doesn’t need the graphics stuff and just wants to run a program on thousands of threads in parallel can simply ignore the graphics hardware and stick to the programmable shader cores, and still be able to leverage nearly all of the chip’s computational power. Heck, a growing number of games are bypassing the fixed-function hardware for some parts of rendering (e.g. compositing with compute shaders instead of drawing screen-sized rectangles, etc.) because it’s faster to simply start a bunch of threads and read+write a bunch of pixels in software.
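As a rough sketch of what that looks like in practice (purely illustrative code, assuming an NVIDIA card with the CUDA toolkit installed; the kernel and buffer names are made up), here’s a minimal CUDA program that treats the GPU as nothing but a big pool of parallel threads; the rasterizers, blenders and the rest of the fixed-function graphics hardware are never touched:
#include <cstdio>
#include <cuda_runtime.h>

// Each thread scales one element of the buffer. No rasterization, no
// fragment blending; only the programmable compute/shader cores are used.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        data[i] *= factor;
    }
}

int main() {
    const int n = 1 << 20;  // ~1 million elements
    float *data;
    cudaMallocManaged(&data, n * sizeof(float));
    for (int i = 0; i < n; i++) data[i] = 1.0f;

    // Launch enough 256-thread blocks to cover every element; the hardware
    // schedules these thousands of threads across the available shader cores.
    scale<<<(n + 255) / 256, 256>>>(data, 2.0f, n);
    cudaDeviceSynchronize();

    printf("data[0] = %f\n", data[0]);  // expect 2.0
    cudaFree(data);
    return 0;
}
Compute shaders in Vulkan, OpenGL or OpenCL work the same way on AMD and Intel GPUs; the API differs, but the work still runs on the same pool of programmable shader cores.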
The “B” in “Boot” looks really off, the inside of the big “O” is lighter than the rest of the sign, and the kerning on the bottom text is all over the place.
DaPorkchop_@lemmy.ml to Ask Lemmy@lemmy.world • My child won’t stop singing the “Lava Chicken” song from the Minecraft movie. How do I go on living? • 12 · 24 days ago
The suggestion and response are both meant humorously. It clearly isn’t a good answer because it doesn’t actually solve the problem, except in some passive-aggressive, far-off-in-the-future way.
DaPorkchop_@lemmy.ml to Ask Lemmy@lemmy.world • How do you mateys and folks deal with the sheer amount of music available out there? • 3 · 24 days ago
I follow a huge number of artists on Bandcamp, which I mostly discovered through Chiptunes=WIN (which unfortunately seems to have erased itself from the internet), and the occasional collab album is enough to keep gradually discovering new artists.
DaPorkchop_@lemmy.ml to Linux@lemmy.ml • GNOME 49 Alpha Released With X11 Support Disabled By Default, Many New Features • 5 · 24 days ago
KDE user here, I still use X11 to play old Minecraft versions. LWJGL2 uses xrandr to read (and sometimes modify? wtf) display configurations on Linux, and the last few times I’ve tried it on Wayland it kept screwing the whole desktop up.
DaPorkchop_@lemmy.ml to KDE@lemmy.kde.social • KDE devs have been quietly working on Plasma Keyboard, a new on-screen keyboard for desktop and mobile part of the “We Care About Your Input” KDE Goals initiative. Although not ready for texting yet, • 5 · 24 days ago
In the future, you can generally solve these sorts of build errors by just installing the development package for whatever library is missing. On Debian-based systems, that would be something along the lines of
sudo apt install libecm<tab><tab>
then see what appears and choose the one that looks reasonable with a -dev suffix.
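As a purely illustrative example (not the actual missing dependency here): if the build had stopped on a missing OpenSSL header, something like
sudo apt install libssl-dev
would typically fix it, since the -dev package ships the headers and pkg-config/CMake files the build system is looking for, while the plain library package only contains the runtime shared objects.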
DaPorkchop_@lemmy.ml to Open Source@lemmy.ml • The Open-Source Software Saving the Internet From AI Bot Scrapers • 91 · 24 days ago
It takes like half a second on my Fairphone 3, and the CPU in this thing is absolute dogshit. I also doubt that the power consumption is particularly significant compared to the overhead of parsing, executing and JIT-compiling the 14MiB of JavaScript frameworks on the actual website.
My sister has this printed on a t-shirt.