Pov: when you overthink too much
This graph might be showing us the timeline to get to AGI
Extended OpenAI Image Query is Next Level
I’m sorry, but I can’t be the only one disappointed by this…
Welp. It happened.
Rumoured GPT-4 architecture: simplified visualisation
The prompt that every LLM gets wrong
Business owners tend to be happier than employees
Is langchain overhyped?
Last year, LLM sizes were decreasing while maintaining quality (e.g. Mistral 7B), but this year the trend seems to be reversing toward larger LLMs, with the latest releases of Grok and Databricks's DBRX
People who are making 300k+/year working for themselves, what do you do?
An open-source 132B param foundation model by Databricks
OpenAI is still dominating the LLM space, but Google is also catching up
Mistral-7B-v0.2 has been uploaded to HF
Is it TRUE that intellectuals, PhDs, and deep technology won't help one build a successful business?
No, we don't
Grok-1 converted to PyTorch fp16 (638GB lol)
Reverse engineering Perplexity
A year ago vs today at Nvidia's Event
Real-time object detection in the browser using Transformers.js
What are your expectations about AGI?
Is it time to cancel the GPT-4 plan?
Netflix is running out of ideas...
New LLM announced from NVIDIA: Nemotron-4 15B. Trained on 8T tokens using 3,072 H100s. Training took 13 days. (Model not yet available)