DeepSeek R1 a scam? Could the Chinese AI be using OpenAI for specific tasks?

I tested R1, both online and locally. I noticed significant differences in quality (although I was only able to run a mid-sized model due to capacity constraints).

Just a theory: could R1 be delegating various tasks to OpenAI in the background and thereby achieving these extreme results (that level of quality with low compute)? Has anyone achieved a similarly high level with a local installation as with the online version?

It would be the perfect plan to crash Chinese tech stocks.



17 Answers
EinTyppie
1 month ago

"(although I was only able to run a mid-sized model due to capacity constraints)"

This pretty much answers the question already. 70B is only a fraction of 671B, so the quality is also only a fraction. Thinking that it secretly routes things through OpenAI is fairly absurd.

The thing is open source. You can see and check absolutely everything.

Such API usage would also be very expensive, since OpenAI's API is ultra expensive; DeepSeek's is roughly 20 times cheaper. On top of that, OpenAI would notice that a single user is sending them MILLIONS of requests a day, and it would be an absolute loss-making business. Makes no sense any way you look at it.
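A quick back-of-the-envelope sketch of why relaying would be a loss-maker. The prices and volumes here are illustrative assumptions, not official rate cards; only the "about 20x cheaper" ratio comes from the comment above:

```python
# Could DeepSeek profitably relay its traffic through OpenAI's API?
# Prices below are illustrative assumptions, not real pricing.
openai_cost_per_mtok = 10.00                          # assumed USD per million tokens
deepseek_price_per_mtok = openai_cost_per_mtok / 20   # "about 20x cheaper"

mtok_per_day = 1000  # hypothetical daily volume: 1000 million tokens

cost = mtok_per_day * openai_cost_per_mtok       # what relaying would cost DeepSeek
revenue = mtok_per_day * deepseek_price_per_mtok # what DeepSeek could charge for it

print(f"Daily cost to relay:  ${cost:,.2f}")
print(f"Daily revenue:        ${revenue:,.2f}")
print(f"Daily loss:           ${cost - revenue:,.2f}")
```

At any assumed volume the relay costs 20x what it earns, so the scheme loses money on every single request.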

sk8terguy
1 month ago

No. DeepSeek is completely open source, so you can see what the program does. Apart from that, it would be insanely expensive to send the data to ChatGPT and have it processed there – and it would be noticed quite quickly, especially with open-source code.

You are using a small local model, which is obviously much less powerful.

fronaldfruck
1 month ago

The local versions are fine-tuned versions of other LLMs such as Llama or Qwen that are made to act as if they were DeepSeek.

The 70B Llama model doesn't come close to the 671B R1 model.

Eromzak
1 month ago
Reply to  fronaldfruck

That's just the label though, isn't it? You pull the DeepSeek R1 model and not the Llama model, right?

fronaldfruck
1 month ago
Reply to  Eromzak

If you have half a terabyte of RAM, you can run the real DeepSeek.

The rest is not DeepSeek.
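A rough estimate of where the "half a terabyte" figure comes from. The bytes-per-parameter values are common quantization assumptions, not official requirements:

```python
# Rough weight-memory estimate for a 671B-parameter model.
# Bytes-per-parameter are typical quantization levels, not vendor specs.
params_billion = 671

for name, bytes_per_param in [("FP16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    gb = params_billion * bytes_per_param  # 1B params at 1 byte each ~ 1 GB
    print(f"{name}: ~{gb:.0f} GB just for the weights")
```

Even aggressively quantized to 4 bits, the weights alone land in the ~336 GB range, beyond a 256 GB memory-controller ceiling, which is why consumer machines can only run the small distilled variants.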

fronaldfruck
1 month ago
Reply to  geheim007b

They used ChatGPT data for training, so that's not surprising.

fronaldfruck
1 month ago

That's not the only way, though; there's also the Mac mini setup.

sgt119
1 month ago

That used to be the case; current ones can take up to 256 GB.

EinTyppie
1 month ago

At some point your CPU hits its limit. Normal consumer CPUs support up to 64-128 GB of RAM.

Eromzak
1 month ago

Yes, you can just throw more RAM in, but I don't know whether the thing can handle more than 256 GB even if I buy the sticks. You'd have to check that.

Ah no, the memory controller says 256 GB max. RIP 671B.

fronaldfruck
1 month ago

Then just put a few more sticks in.

Eromzak
1 month ago

No, I've got an old server from a family member standing around here. Threadripper stuff.

My desktop board has "only 64 GB".

fronaldfruck
1 month ago

What system do you have with 250GB RAM? Is that a consumer mainboard?

Eromzak
1 month ago

Just a quarter of that, unfortunately. But one size smaller works too.