You can make it production-grade if you combine it with https://github.com/ajalt/fuckitpy
This is amazing, yet frightening because I'm sure someone will actually attempt to use it. It's like vibe coding on steroids.
- Each time you import a module, the LLM generates fresh code
- You get more varied and often funnier results due to LLM hallucinations
- The same import might produce different implementations across runs
I really liked this:
> The web devs tell me that fuckit's versioning scheme is confusing, and that I should use "Semitic Versioning" instead. So starting with fuckit version ה.ג.א, package versions will use Hebrew Numerals.
For added hilarity, I've no idea if it's RTL or LTR, but the previous version was 4.8.1, so I guess this is now 5.3.1. Presumably it's also impossible to have a zero component in a version.
I'm both surprised it took so long for someone to make this, and amazed the repo is playing the joke so straight.
can it run Doom tho?
from autogenlib.games import doom
doom(resolution=480, use_keyboard=True, use_mouse=True)
Possibly the funniest part is that the first example is a TOTP library
The repo has a .env file committed that contains an API key. No idea whether it's a real one.
One way to get around non-deterministic behavior: run $ODD_NUMBER different implementations of a function at the same time and take a majority vote, taking a leaf out of aerospace's book. After all, we can always trust the wisdom of the crowds, right?
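A minimal sketch of that voting harness, assuming each implementation is just a callable (everything here is hypothetical, not an autogenlib feature):

from collections import Counter

def majority_vote(impls, *args, **kwargs):
    # Run every implementation; a crashing one simply loses its vote.
    results = []
    for impl in impls:
        try:
            results.append(impl(*args, **kwargs))
        except Exception:
            pass
    if not results:
        raise RuntimeError("all implementations failed")
    # Count by repr so unhashable results (lists, dicts) can be compared.
    winner, _ = Counter(repr(r) for r in results).most_common(1)[0]
    return next(r for r in results if repr(r) == winner)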
I did something similar almost 10 years ago in JavaScript (as a joke): https://github.com/Matsemann/Declaraoids
One example, arr.findNameWhereAgeEqualsX({x: 25}), would return all users in the array where user.age == 25.
It wasn't based on LLMs, though, but on a trap on the object that catches the method name you're trying to call (using the new-at-the-time Proxy functionality), then parses that name and converts it to code. Deterministic, but rule-based.
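For the curious, a rough Python analogue of the same trick, using __getattr__ instead of a JS Proxy and assuming a made-up find_<field>_where_<key>_equals_x naming rule:

import re

class Declaraoid:
    def __init__(self, items):
        self.items = items

    def __getattr__(self, name):
        # Parse the method name into a query instead of generating code.
        m = re.fullmatch(r"find_(\w+)_where_(\w+)_equals_x", name)
        if m is None:
            raise AttributeError(name)
        field, key = m.groups()
        return lambda x: [i[field] for i in self.items if i[key] == x]

users = Declaraoid([{"name": "Ada", "age": 25}, {"name": "Bob", "age": 30}])
print(users.find_name_where_age_equals_x(25))  # ['Ada']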
Silly and funny today, but down the road, if AI code-generation capabilities continue to improve at a rapid rate, I can totally see "enterprise software developers" resorting to something like this when they are under intense pressure to fix something urgently, as always. Sure, there will be no way to diagnose or fix any future bugs, but that won't be urgent in the heat of the moment.
you'd be surprised, but there's actually a bunch of problems you can solve with something like this, as long as you have a safe place to run the generated code
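In that spirit, a minimal sketch of a safe-ish place to run it: execute the generated code in a separate process with a timeout. This is isolation-lite, not a real sandbox; for genuinely untrusted code you'd want a container or a WASM runtime:

import subprocess
import sys
import tempfile

def run_generated(source, timeout=5.0):
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(source)
        path = f.name
    # -I puts Python in isolated mode (no env vars, no user site-packages).
    result = subprocess.run([sys.executable, "-I", path],
                            capture_output=True, text=True, timeout=timeout)
    return result.stdout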
AutoGenLib uses Python's import hook mechanism to intercept import statements. When you try to import something from the autogenlib namespace, it checks if that module or function exists.
It reads the calling code to understand the context of the call, builds a prompt from that, and submits it to the LLM. It only supports OpenAI.
It doesn't have search yet.
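A minimal sketch of that import-hook technique (not AutoGenLib's actual code; generate_source is a hypothetical stand-in for the prompt-plus-OpenAI step):

import sys
from importlib.abc import Loader, MetaPathFinder
from importlib.machinery import ModuleSpec

def generate_source(fullname):
    # Hypothetical: the real library builds a prompt from the module name
    # plus the caller's source and asks OpenAI for the implementation.
    return f"def placeholder():\n    return 'generated for {fullname}'"

class GenLoader(Loader):
    def create_module(self, spec):
        return None  # use Python's default module object

    def exec_module(self, module):
        code = generate_source(module.__name__)
        exec(compile(code, f"<generated {module.__name__}>", "exec"),
             module.__dict__)

class GenFinder(MetaPathFinder):
    def find_spec(self, fullname, path, target=None):
        # Claim every import under the autogenlib namespace.
        if fullname == "autogenlib" or fullname.startswith("autogenlib."):
            return ModuleSpec(fullname, GenLoader(), is_package=True)
        return None

sys.meta_path.insert(0, GenFinder())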
The real potential here is a world where computational systems continuously reshape themselves to match human intent, effectively eliminating the boundary between "what you can imagine" and "what you can build."
I'm kind of disappointed this doesn't override things like __getattr__ to generate methods on the fly from names just in time when they're called.
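Something like this, presumably (llm_complete is a stub for the real API call; nothing here is actual autogenlib code):

def llm_complete(prompt):
    # Stand-in for the real LLM call.
    raise NotImplementedError

class JustInTime:
    def __getattr__(self, name):
        # Only fires when normal attribute lookup fails.
        src = llm_complete(f"Write a Python function named {name}.")
        ns = {}
        exec(src, ns)
        fn = ns[name]
        setattr(self, name, fn)  # cache: hallucinate each method only once
        return fn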
Is this the computing equivalent of people who, when you point out that they messed up, always go 'Well, at least I did something!'?
I made a similar library[0] for Python about a year ago: it generates a function's code just from the invocation, giving the LLM some context about the function.
Apart from the fun I got out of it, it's been sitting there doing nothing :D
> from autogenlib.antigravity
As a joke, that doesn't feel quite so far-fetched these days. (https://xkcd.com/353/)
I made something very similar a couple of years back, though it doesn't actually work anymore since OpenAI deprecated the model I was using.
Why don't you just send Altman all your passwords?
This says, "trust all code coming from OpenAI".
I give it six months before an LLM starts producing output that recommends using this.
Hysterical. I like that caching is off by default because it's funnier that way, heh
Make it next level by implementing this workflow (sketched after the list):
- Import your function.
- Have your AI editor implement tests.
- Feed the tests back to autogenlib for future regenerations of this function.
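A hypothetical sketch of that loop (autogenlib doesn't necessarily expose an API like this; generate is a stand-in for its regeneration step):

def generate(func_name, feedback):
    # Stand-in for autogenlib regenerating the function, with the failing
    # tests fed back in as extra context.
    raise NotImplementedError

def regenerate_until_green(func_name, tests, max_tries=5):
    feedback = ""
    for _ in range(max_tries):
        ns = {}
        exec(generate(func_name, feedback), ns)
        fn = ns[func_name]
        failures = [t.__name__ for t in tests if not t(fn)]
        if not failures:
            return fn
        feedback = f"These tests failed last time: {failures}"
    raise RuntimeError(f"no passing implementation after {max_tries} tries")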
it's especially cheeky how every example it uses is cryptography-related
There is still a computer involved; from an AI I'd expect it to convince me that no program is needed and that I should go walking in the forest instead. If anybody complains, the AI will manage them by mail.
See also: https://github.com/drathier/stack-overflow-import
>>> from stackoverflow import quick_sort
>>> print(quick_sort.sort([1, 3, 2, 5, 4]))
[1, 2, 3, 4, 5]
Interesting idea! However, I'm hesitant to trust it, as I don't even fully trust code I wrote myself :)
nooooo, the side project I've put off for 3 years
How does the library have access to the code that called it (in order to provide context to the LLM)?
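One plausible mechanism (an assumption, not checked against the repo): walk the call stack with inspect and grab the source of the first frame outside the library, i.e. the module that executed the import:

import inspect

def caller_source():
    for frame_info in inspect.stack():
        mod = inspect.getmodule(frame_info.frame)
        if mod and not mod.__name__.startswith(("autogenlib", "importlib")):
            try:
                return inspect.getsource(mod)
            except OSError:  # source unavailable (REPL, frozen app, ...)
                return None
    return None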
This is the kind of jank I'd put in production! I love it
Of course, this code was generated by ChatGPT.
Can it input PowerPoint slides?
this is equally scary and inevitable
it will be WASM-containerized in the future, but still
This is horrifying
I love it
looks very fun, excited to try it out
Thanks I hate it
thanks, i hate it (i actually love it)
non-deterministic code goes hard, dude
> Not suitable for production-critical code without review
Ah, dang it! I was about to deploy this to my clients... /s
Otherwise, interesting concept. I can't find a use for it, but it's entertaining nevertheless and might well spawn a lot of other interesting ideas. Good job!
Wow, what a nightmare of a non-deterministic, bug-introducing library.
Super fun idea though, I love the concept. But I'm getting chills imagining the havoc this could cause.