
How I use LLMs – neat tricks with Simon’s LLM command

Earlier this year I co-authored a report about the direct environmental impact of AI, which might give the impression I’m massively anti-AI, because it talks about the significant social and environmental impacts of using it. I’m not. I’m (slowly) working through the content of the Climate Change AI summer school, and I use AI a fair amount in my job. This post shows some examples of how I use it.

I’ve got into the habit of running an LLM locally on my machine in the background, having it sit there so I can pipe text or quick local queries into it.

I’m using Ollama, and Simon Willison’s wonderful llm tool. I use it like this:

llm "My query goes here"

I’m able to continue discussions using the -c flag like so:

llm -c "continue discussion in an existing conversation"

It’s very handy, and because it’s on the command line, I can pipe text into and out of it.

Doing this with multi-line queries

Of course, you don’t want to write every query on the command line.

If I have a more complicated query, I now do this:

cat my-longer-query.txt | llm

Or, if I want the llm to respond in a specific way, I can send a system prompt like so:

cat my-longer-query.txt | llm -s "Reply angrily in ALL CAPS"

Because llm can use multiple models, if I find that the default local model (currently Llama 3.2) is giving me poor results, I can sub in a different one.

So, let’s say I have my query, and I’m not happy with the response from the local Llama 3.2 model. I could then pipe the same query into Claude instead (I’d need an API key and the rest set up, but that’s left as an exercise for the reader, as the llm docs are fantastic):

cat my-longer-query.txt | llm -m claude-3.5-sonnet

Getting the last conversation

Sometimes you want to fetch the last thing you asked an llm, along with its response.

llm logs -r

Or maybe the entire conversation:

llm logs -c

In both cases I usually either pipe it into my editor, which has handy markdown preview:

llm logs -c | code -

Or, if I want to make the conversation visible to others, GitHub’s gh command has a handy way to create a gist in a single CLI invocation.

llm logs -c | gh gist create --filename chat-log.md -

This will return a URL for a publicly accessible secret gist that I can share with others.

Addendum – putting a handy wrapper around these commands

I have a very simple shell function, ve, that opens a temporary file for me to jot stuff into and, upon save, echoes the content to STDOUT using cat.

I use the fish shell.

This then lets me write queries in my editor, which I usually have open, without needing to worry about cleaning up the file I was writing in. Because llm stores every request and response in a local SQLite database, I’m not worried about needing to keep these files around.

function ve --description "Open temp file in VSCode and output contents when closed"
    # Create a temporary file
    set tempfile (mktemp)
    # Open VSCode and wait for it to close
    code --wait $tempfile
    # If the file has content, output it and then remove the file
    if test -s $tempfile
        cat $tempfile
        rm $tempfile
    else
        rm $tempfile
        return 1
    end
end
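For readers not on fish, here is a rough sketch of how the same idea might look as a bash/POSIX function. It assumes your $EDITOR blocks until the file is closed (for VS Code that means EDITOR="code --wait"); the function name and structure mirror the fish version above.

```shell
# Hypothetical bash port of the ve function above.
# Assumes $EDITOR blocks until the file is closed (e.g. EDITOR="code --wait").
ve() {
    local tempfile
    tempfile=$(mktemp)
    # $EDITOR is deliberately unquoted so multi-word editors split correctly
    ${EDITOR:-vi} "$tempfile"
    if [ -s "$tempfile" ]; then
        # File has content: print it, then clean up
        cat "$tempfile"
        rm "$tempfile"
    else
        # Nothing written: clean up and signal failure
        rm "$tempfile"
        return 1
    fi
}
```

As in the fish version, piping its output onward (ve | llm) is the whole point.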

This lets me do this now for queries:

ve | llm

One liner queries

I’ve also since set up another shortcut like this for quick questions I’d like to see the output from, like so:

function ask-llm --description "Pipe a question into llm and display the output in VS Code"
    set -l question $argv
    llm $question | code -
end

This lets me do this now:

ask-llm "My question that I'd like to ask"

Do you use this all the time?

Not really. I started using Perplexity last year as my way into experimenting with Gen AI, after hearing friends explain it was a significant improvement on using Search as it enshittifies. I also sometimes use Claude, because Artifacts are such a neat feature.

I also experimented with Hugging Face’s Hugging Chat thing, but over time, I’ve got more comfortable using llm.

If I wanted a richer interface than what I use now, I’d probably spend some time using Open Web UI. If I were to strategically invest in building a more diverse ecosystem for Gen AI, that’s where I would spend it. Mozilla, this is where you should be investing time and money if you insist on jamming AI into things.

In my dream world, almost every Gen AI query I make is piped through llm, because that means all the conversations are stored in a local SQLite database that I can do what I like with.
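For example, you could pull recent prompts straight out of that database with the sqlite3 command-line tool. A minimal sketch: the helper name is mine, and the schema it assumes (a responses table with model, prompt and datetime_utc columns) is my guess at llm's layout, so check your own database before relying on it.

```shell
# Hypothetical helper: show the last few prompts stored in an llm logs database.
# Assumes a `responses` table with `model`, `prompt` and `datetime_utc`
# columns -- verify against your own logs.db before relying on this.
recent_prompts() {
    # Usage: recent_prompts /path/to/logs.db [limit]
    sqlite3 "$1" \
        "SELECT model, prompt FROM responses ORDER BY datetime_utc DESC LIMIT ${2:-5};"
}
```

llm can print the location of its database with llm logs path, so recent_prompts "$(llm logs path)" would query the live one.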

In fact, I’d probably pay an annual fee (preferably to Simon!) to have my llm SQLite database backed up somewhere safe, or accessible from multiple computers, because as I use llm more, it becomes more valuable to me, and the consequences of losing it, or corrupting it in some way, become greater.
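In the meantime, SQLite itself can take a consistent snapshot of a live database via the sqlite3 shell's .backup command. A sketch, not a polished tool; the function name and destination path are mine:

```shell
# Stopgap backup: snapshot a SQLite database with the online backup command,
# which stays consistent even if llm writes to the database during the copy.
backup_llm_db() {
    # Usage: backup_llm_db /path/to/logs.db /path/to/backup.db
    sqlite3 "$1" ".backup '$2'"
}
```

Pointed at llm's own database, that might look like backup_llm_db "$(llm logs path)" ~/Backups/llm-logs-$(date +%F).db, with the destination inside a synced folder.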

If you have had success using llm that way, I’d love to hear from you.

Green Web Foundation – Thinking about using AI? Here's what you can and (probably) can't change about AI's environmental impact: a briefing intended to help people with responsibility for AI projects understand the considerations around their direct negative environmental impact.

irish-election-2024.vercel.app

I've been working on this candidate explorer for the upcoming Irish general election. At the moment it contains basic info about who is running where, candidate/constituency/party search and some stats at the party and constituency level. I'm hoping that by election day I'll be able to add a lot more info, and bring in additional datasets.

#irishge #ireland #civics #civichacker #irishelection #daileireann #ge2024 #ge24 #irishpolitics #datasette

Any civics hackers working on open data or tools for the upcoming Irish general election? (Or otherwise Ireland civics related things)

Let me know, I'd love to see what's out there and potentially contribute.

I'm also working on a candidate directory to try and provide as much info on candidates as possible, so if you could let me know if that already exists that'd be great.

It's not that hard: databases for journalists – that was the name of the workshop at Netzwerk Recherche, by @joliyea and @cutterkom.

The focus: installing @datasette, trying it out, and adding plugins.

The tutorial is available as a repo on GitHub here: github.com/br-data/nr24-datase


@luis_in_brief @shauna
The datasette project is actually a great example. docs.datasette.io/en/stable/pl

I believe the datasette plugin directory is based on a GitHub open graph search on the #datasette-plugin project tag, so there isn't a lot of curation involved, but it does provide a really good UI/UX and allows self-service inclusion by simply tagging your project on GitHub.


While collecting data for @pypodcats, I needed a way to easily share my findings with the rest of the team. At first I was gonna email the csv file, but what good would that do? It's data but not meaningful.

With Datasette, I was able to easily publish my data on Heroku, and share the URL with the team, and they'd be able to perform their own SQL queries and gather whatever info they need.

The feature for publishing #Datasette to Heroku is just brilliant. 💯

Thanks @simon for Datasette.


Waiting by the radio for your favourite song and hitting record at just the right moment... and then hoping the presenters don't talk over it.

Quickly making a hot chocolate while the program loads from the #Datasette.

What are your memories? Do you still use cassettes (#Kassetten)?

iv.melmac.space/watch?v=jVoSQP

Techmoan | Invidious – "Cassettes - better than you don't remember": a beginner's introduction to Compact Cassette, using metal tapes and Dolby S to see if a cassette really can sound 'almost as good as a CD'.