

If only it had a Snapdragon 8 Elite instead of a MediaTek chip
If anything, I wish competition in the laptop space were fiercer. We know Nvidia and AMD are making ARM chips, but it might be a while; all we have right now is Snapdragon, and those chips aren’t very good.
DON’T USE UNETBOOTIN. More often than not it breaks something and causes issues for people. Somebody I know used it and broke booting into Windows; he had to use a USB anyway to fix the bootloader.
Xiaomi, Oppo, Vivo, and OnePlus are exploring the possibility of developing versions of the Android operating system that do not rely on Google Mobile Services (GMS).
Clickbait title. They’re rumored to fork Android, which would still just be Android without the Google suite.
FYI it fails to fetch votes if you don’t include https:// at the start.
if not potentially better, than this one.
Let’s not get too crazy; there are some graphics mods, but Oblivion will still look like a PS3 game at best.
Yes, I use it often and it works great now.
A little early, isn’t it? I heard COSMIC is still relatively unstable since it’s in beta.
That’s awesome! I don’t know what sparked more people to come over, though. Would be great if someone could fill me in since I’m so out of the loop.
Also, PSA to all new users on Desktop: Try Alexandrite - a gorgeous front-end for Lemmy https://alexandrite.app/lemm.ee/
Well, emulators are mostly CPU-bound, and the Snapdragon 8 Elite is a surprisingly powerful chip. It’s doable with a flagship phone.
Wait, why is Fedora making their own Flatpaks? I thought the entire point was that they work on any distro and everybody gets the original source from Flathub.
I understand it well. It’s still relevant to mention that you can run the distilled models on consumer hardware if you really care about privacy. 8GB+ of VRAM isn’t crazy, especially if you have a lot of unified memory on MacBooks or on some Windows laptops releasing this year with 64+ GB of unified memory. There are also websites re-hosting various versions of DeepSeek, like Hugging Face hosting the 32B model, which is good enough for most people.
Instead, the article is written as if there were no way at all to use DeepSeek privately, which is simply wrong.
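For anyone curious, here’s a minimal sketch of what running one of the distilled models fully offline looks like with the transformers library. The model ID is one of the published R1 distills, but treat the exact ID, dtype, and generation settings as examples to adapt to your hardware:

```python
# Minimal sketch: run a distilled DeepSeek model fully offline with Hugging Face transformers.
# Requires the transformers and accelerate packages plus enough VRAM/unified memory;
# pick a smaller distill (7B/8B) if 32B doesn't fit on your hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-32B"  # example distill; swap in a smaller one if needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

messages = [{"role": "user", "content": "Explain why running an LLM locally helps privacy."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Once the weights are downloaded, nothing in that flow talks to DeepSeek’s servers.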
DeepSeek is open source, meaning you can modify code on your own app to create an independent — and more secure — version. This has led some to hope that a more privacy-friendly version of DeepSeek could be developed. However, using DeepSeek in its current form — as it exists today, hosted in China — comes with serious risks for anyone concerned about their most sensitive, private information.
Any model trained or operated on DeepSeek’s servers is still subject to Chinese data laws, meaning that the Chinese government can demand access at any time.
What??? Whoever wrote this sounds like they have zero understanding of how it works. There is no “more privacy-friendly version” that could be developed; the models are already out, and you can run the entire model 100% locally. That’s as privacy-friendly as it gets.
“Any model trained or operated on DeepSeek’s servers is still subject to Chinese data laws”
Operated, yes. Trained, no. The model is MIT-licensed; China has nothing on you when you run it yourself. I expect better from a company whose whole business is built on privacy.
HuggingChat is open source and lets you use DeepSeek.
Very misleading; it only lets you use the lighter, watered-down version (the DeepSeek 32B distill), not the large, impressive model they actually have (DeepSeek 671B).
Making a dumb tweet doesn’t make you a fascist, and it doesn’t invalidate the years of hard work people put into a non-profit Swiss company. You should get over yourself.
Every algorithm is just sorting. Facebook sorts which posts should be shown on top and which should be pushed down. Posts with a bad rank aren’t shown to many people, and eventually to none at all.
algorithm free feed
Sorting by new?
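To make that concrete, here’s a toy sketch (made-up posts and a made-up scoring formula): a “feed algorithm” is just a sort by some engagement score, and an “algorithm-free” feed is the same list sorted by timestamp instead.

```python
# Toy illustration: a "feed algorithm" is just sorting posts by some score,
# while an "algorithm-free" feed is the same posts sorted by recency ("new").
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    comments: int
    posted_at: int  # unix timestamp

posts = [
    Post("Cat picture", likes=500, comments=20, posted_at=1_700_000_000),
    Post("Local news", likes=30, comments=5, posted_at=1_700_050_000),
    Post("Hot take", likes=200, comments=150, posted_at=1_700_020_000),
]

def engagement_score(post: Post) -> int:
    # Made-up weighting; real platforms tune this endlessly, but it's still just a sort key.
    return post.likes + 3 * post.comments

ranked_feed = sorted(posts, key=engagement_score, reverse=True)               # engagement-ranked feed
chronological_feed = sorted(posts, key=lambda p: p.posted_at, reverse=True)   # "sorting by new"

print([p.title for p in ranked_feed])         # ['Hot take', 'Cat picture', 'Local news']
print([p.title for p in chronological_feed])  # ['Local news', 'Hot take', 'Cat picture']
```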
Look up LM Studio. It’s free software that lets you easily install and run local LLMs. Note that you need a good graphics card and a lot of RAM for it to be useful.
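It can also serve whatever model you’ve loaded over a local OpenAI-compatible API, so you can script against it. A rough sketch, assuming the local server is running on its default address and treating the model name as a placeholder:

```python
# Sketch: query a model served by LM Studio's local OpenAI-compatible server.
# Assumes the local server is enabled and listening on its default address
# (http://localhost:1234/v1); no real API key is needed for a local endpoint.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed-locally")

response = client.chat.completions.create(
    model="local-model",  # placeholder; use the model identifier shown in LM Studio
    messages=[{"role": "user", "content": "Hello from a fully local LLM!"}],
)
print(response.choices[0].message.content)
```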
From what I understand, Discord’s screen sharing was fixed in the Canary version (Discord’s testing build) but not yet in the stable package. Try installing that and see if it works.
Discuit isn’t part of the fediverse