That model was trained in part using their unreleased R1 "reasoning" model. Today they've released R1 itself, along with a whole family of new models derived from that base.
There's a lot of stuff in the new release.
DeepSeek-R1-Zero appears to be the base model. It's over 650GB in size and, like most of their other releases, is under a clean MIT license. DeepSeek warn that "DeepSeek-R1-Zero encounters challenges such as endless repetition, poor readability, and language mixing." ... so they also released:
DeepSeek-R1, which "incorporates cold-start data before RL" and "achieves performance comparable to OpenAI-o1 across math, code, and reasoning tasks". That one is also MIT licensed, and is a similar size.
I don't have the ability to run models larger than about 50GB (I have an M2 with 64GB of RAM), so neither of these two models are something I can easily play with myself. That's where the new distilled models come in.
> To support the research community, we have open-sourced DeepSeek-R1-Zero, DeepSeek-R1, and six dense models distilled from DeepSeek-R1 based on Llama and Qwen.
This is a fascinating flex! They have models based on Qwen 2.5 (14B, 32B, Math 1.5B and Math 7B) and Llama 3 (Llama-3.1 8B and Llama 3.3 70B Instruct).
Weirdly those Llama models have an MIT license attached, which I'm not sure is compatible with the underlying Llama license. Qwen models are Apache licensed so maybe MIT is OK?
(I also just noticed the MIT license files say "Copyright (c) 2023 DeepSeek" so they may need to pay a bit more attention to how they copied those in.)
Licenses aside, these distilled models are fascinating beasts.
Running DeepSeek-R1-Distill-Llama-8B-GGUF
Quantized versions are already starting to show up. So far I've tried just one of those, unsloth/DeepSeek-R1-Distill-Llama-8B-GGUF released by Unsloth AI, and it's really fun to play with.
I'm running it using the combination of Ollama, LLM and the llm-ollama plugin.
First I fetched and ran the model using Ollama itself:
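A minimal sketch of that command, assuming Unsloth's GGUF repository on Hugging Face and its Q8_0 quantization tag:

```bash
# Fetch the distilled 8B GGUF from Hugging Face and start an interactive chat
ollama run hf.co/unsloth/DeepSeek-R1-Distill-Llama-8B-GGUF:Q8_0
```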
This downloads around 8.5GB of model data and starts an interactive chat interface.
Once the model has been fetched, LLM can talk to it as well. I prefer using LLM for experiments because it logs everything to SQLite for later exploration.
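Those logs can be pulled straight back out of the CLI; a quick sketch, assuming a recent release of LLM:

```bash
# Show where the SQLite log database lives
llm logs path

# Display the most recent logged prompt and response
llm logs -n 1
```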
I installed the plugin and ran it like this:
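A sketch of those two steps, again assuming the Q8_0 quantization tag from the Ollama pull above:

```bash
# Install the plugin that exposes Ollama models to LLM
llm install llm-ollama

# Run a prompt against the locally hosted model
llm -m 'hf.co/unsloth/DeepSeek-R1-Distill-Llama-8B-GGUF:Q8_0' \
  'a joke about a pelican and a walrus who run a tea room together'
```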
Each response starts with a `<think>...</think>` pseudo-XML tag containing the chain of thought used to help generate the response. For "a joke about a pelican and a walrus who run a tea room together" that started like this:
Okay, so I need to come up with a joke about a pelican and a walrus running a tea room together. Hmm, that's an interesting combination. Let me think about how these two characters might interact in a humorous situation.
First, let's consider their characteristics. Pelicans are known for their long beaks and webbed feet, often seen near the beach or water. Walruses have big teeth, thick fur, and they're usually found in colder climates, like icebergs or snowy areas. So, combining these two into a tea room setting is already a funny image.
[...]
It then thought for 20 paragraphs before outputting the joke! Here's the full transcript and here's the output at the end, after the `</think>` tag:
The Joke:
A pelican and a walrus decide to open a quaint little tea room together. The walrus, with its huge size, struggles to find comfortable chairs, so it sits on the table by accident, knocking over the teapot. Meanwhile, the pelican, trying to help, uses its beak to place dishes on the table, causing a few spills.
After a series of comical mishaps, the walrus looks up and says with a grin, "This isn't so fishy anymore." The pelican smirks and replies, "Maybe not, but we do have a lot of krill in our tea!"
Explanation:
- Setup: Pelican and Walrus run an unlikely tea room. Their sizes lead to comedic chaos.
- Punchline: "This isn't so fishy anymore" is a pun on the situation becoming less absurd. The follow-up joke introduces "krill," a play on words that ties into both marine life ("fish") and tea, making it humorous while fitting the theme.
So yeah, the joke is awful. But the process of getting there was such an interesting insight into how these new models work.
This was a fairly small 8B model. I'm looking forward to trying out the Llama 70B version, which isn't yet available in a GGUF I can run with Ollama. Given the strength of Llama 3.3 70B (currently my favourite GPT-4 class model that I've run on my own machine) I have high expectations.
Update 21st January 2025: I got this quantized version of that Llama 3.3 70B R1 distilled model working like this, a 34GB download:
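A sketch of that command; the Q3_K_M quantization tag is an assumption that matches the roughly 34GB size:

```bash
# Fetch and run the 70B distilled model, around a 34GB download at this quantization
ollama run hf.co/unsloth/DeepSeek-R1-Distill-Llama-70B-GGUF:Q3_K_M
```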
Can it draw a pelican?
I tried my classic Generate an SVG of a pelican riding a bicycle prompt too. It did not do very well:
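Run through LLM that would look something like this; the model identifier reuses the 70B sketch above and is an assumption:

```bash
llm -m 'hf.co/unsloth/DeepSeek-R1-Distill-Llama-70B-GGUF:Q3_K_M' \
  'Generate an SVG of a pelican riding a bicycle'
```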
It looked to me like it got the order of the elements wrong, so I followed up with:
the background ended up covering the rest of the image
It thought some more and gave me this:
As with the earlier joke, the chain of thought in the transcript was far more interesting than the end result.
Other ways to try DeepSeek-R1
If you want to try the model out without installing anything you can do so using chat.deepseek.com. You'll need to create an account (sign in with Google, use an email address or provide a Chinese +86 phone number) and then select the "DeepThink" option below the prompt input box.
DeepSeek offer the model via their API, using an OpenAI-compatible endpoint. You can access that through LLM by dropping this into your extra-openai-models.yaml configuration file:
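A sketch of that configuration; the api.deepseek.com base URL and deepseek-reasoner model ID follow DeepSeek's API documentation, but treat the exact values as assumptions:

```yaml
# extra-openai-models.yaml: point LLM at DeepSeek's OpenAI-compatible API
- model_id: deepseek-reasoner
  model_name: deepseek-reasoner
  api_base: "https://api.deepseek.com"
  api_key_name: deepseek
```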
Then run `llm keys set deepseek` and paste in your API key, then use `llm -m deepseek-reasoner 'prompt'` to run prompts.
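Spelled out as commands:

```bash
# Store the API key under the name referenced by api_key_name above
llm keys set deepseek

# Run a prompt against the API-hosted model
llm -m deepseek-reasoner 'a joke about a pelican and a walrus who run a tea room together'
```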
This won't show you the reasoning tokens, sadly. Those are returned by the API (example here) but LLM doesn't yet have a way to display them.