151 points by bundie 2 weeks ago | 129 comments
wiseowise 2 weeks ago
And now everyone is rewriting everything in Go/Rust left and right.
cameronh90 2 weeks ago
I'd argue Rust and Go are even easier to work with than Python/JS/TS. The package management is better, and static linked native binaries eliminate so many deployment headaches.
fragmede 2 weeks ago
metaltyphoon 2 weeks ago
supriyo-biswas 2 weeks ago
saagarjha 2 weeks ago
metaltyphoon 2 weeks ago
steveklabnik 2 weeks ago
Go used to try and let you do it, but has walked back those implementations after all the bugs they've caused, in my understanding.
fragmede 2 weeks ago
pjmlp 2 weeks ago
Sometimes people really need to follow Yoda's and Mr Miyagi's advice, instead of jumping right into it.
cameronh90 2 weeks ago
Go and Rust prove you can get most of the benefit of C/C++ without paying that complexity cost.
pjmlp 2 weeks ago
jvanderbot 2 weeks ago
diggan 2 weeks ago
Especially interesting for software that spends 99.9% of its time waiting for inference to come back to you. Sure, it makes sense to rewrite something that relies heavily on the CPU, or where you want an easier time dealing with concurrency, but I feel like that's not what makes Codex take a long time to finish a prompt.
With that said, I also rewrote my local LLM agent software in Rust, as it's easier to deal with concurrency compared to my initial Python prototype. That it compiles into a neat binary is an additional benefit, but I could just as easily have gone with Go instead of Rust.
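Roughly the shape of what I mean, as a minimal sketch (assuming the tokio and reqwest crates; the endpoint and prompts are placeholders): two inference calls fired concurrently and joined, without any of the asyncio plumbing the Python prototype needed.

```rust
// Minimal sketch, not the actual agent: two inference requests in flight at
// once. Assumes the tokio and reqwest crates; the endpoint is a placeholder.
use std::error::Error;

async fn ask(client: &reqwest::Client, url: &str, prompt: &str) -> Result<String, Box<dyn Error>> {
    // Each call awaits its own HTTP round trip without blocking the other one.
    let text = client.post(url).body(prompt.to_owned()).send().await?.text().await?;
    Ok(text)
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    let client = reqwest::Client::new();
    // Run both requests concurrently and wait for both results.
    let (a, b) = tokio::join!(
        ask(&client, "http://localhost:8080/v1/completions", "summarize this file"),
        ask(&client, "http://localhost:8080/v1/completions", "list the TODOs"),
    );
    println!("{}\n{}", a?, b?);
    Ok(())
}
```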
gwynforthewyn 2 weeks ago
In a different domain, I’ve seen a CLI tool that requests an OAuth token, written in Python, be rewritten in Rust and gain a huge performance boost. The Rust version had requested a token and presented it back to the app in a few milliseconds, but it took Python about five seconds just to load the modules the OAuth vendor recommends.
That’s a huge performance boost, never mind how much simpler it is to distribute a compiled binary.
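For what it's worth, the compiled version has essentially nothing to load before it can start talking to the network. A minimal sketch of that kind of token fetch (not the actual tool; assumes reqwest with the blocking and json features plus serde_json, and the endpoint and credentials are placeholders):

```rust
// Minimal sketch of a client-credentials token fetch, not the tool in question.
// Assumes reqwest (blocking + json features) and serde_json; the endpoint and
// credentials below are placeholders.
use std::time::Instant;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let start = Instant::now();
    let client = reqwest::blocking::Client::new();
    let token: serde_json::Value = client
        .post("https://auth.example.com/oauth/token") // placeholder endpoint
        .form(&[
            ("grant_type", "client_credentials"),
            ("client_id", "example-client"),
            ("client_secret", "example-secret"),
        ])
        .send()?
        .error_for_status()?
        .json()?;
    // The whole run, process start included, is dominated by the network round trip.
    println!("got token in {:?}", start.elapsed());
    println!("{}", token["access_token"]);
    Ok(())
}
```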
emporas 2 weeks ago
Even if a GC'ed language like Go is very fast at allocating/deallocating memory, Rust often has no need to allocate or deallocate that memory in the first place. The programmer gives the compiler the tools to optimize memory management, and machines are better at optimizing memory than humans. (Some kinds of optimizations, anyway.)
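To make that concrete, a small sketch (mine, not from any of these codebases): the borrowing version never touches the heap, while a GC-style translation typically would.

```rust
// Sketch: the borrowing version does no heap allocation at all; lifetimes are
// checked at compile time, so there is no runtime collector involved.
fn count_words_borrowed(text: &str) -> usize {
    text.split_whitespace().count()
}

// A GC-style translation would typically copy into freshly allocated strings
// first, then let the runtime clean them up later.
fn count_words_allocating(text: &str) -> usize {
    let words: Vec<String> = text.split_whitespace().map(str::to_owned).collect();
    words.len()
}

fn main() {
    let text = "the compiler knows exactly when this buffer can be reused";
    assert_eq!(count_words_borrowed(text), count_words_allocating(text));
    println!("{} words", count_words_borrowed(text));
}
```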
nasretdinov 2 weeks ago
rybosome 2 weeks ago
Module import cost is enormous, and because Python is highly dynamic you can do lots of cute tricks to defer it out of startup time in a long-running app, but for one-shot CLI operations that don't run a daemon or anything there's just nothing you can do.
I really enjoy Python as a language and an ecosystem, and feel it very much has its place…which is absolutely not anywhere that performance matters.
EDIT: and there’s a viable alternative. Python is the ML language.
teaearlgraycold 2 weeks ago
dayjah 2 weeks ago
I agree it’s hell. But I’ve not found many comprehensive packaging solutions that aren’t gnarly in some way.
IMHO the Python Packaging community have done an excellent job of producing tools to make packaging easy for folks, especially if you’re using GitHub actions. Check out: https://github.com/pypa/cibuildwheel
Pypa have an extensive list of GitHub actions for various use cases.
I think most of us end up in the “pure hell” because we read the docs on how to build a package instead of using the tools the experts created to hide the chaos. A bit like building a deb by hand is a lot harder than using the tools which do it for you.
teaearlgraycold 2 weeks ago
QuadmasterXLII 2 weeks ago
teaearlgraycold 2 weeks ago
crabmusket 2 weeks ago
bxdhxhxh 2 weeks ago
In my opinion, bundling the application payload would be sufficient for interpreted languages like Python and JavaScript.
dweekly 2 weeks ago
BrouteMinou 2 weeks ago
apwell23 2 weeks ago
jeremyloy_wt 2 weeks ago
apwell23 2 weeks ago
"The existing code base makes certain assumptions -- specifically, it assumes that there is automatic garbage collection -- and that pretty much limited our choices."
tonyhart7 2 weeks ago
This is the next trend.
mbb70 2 weeks ago
An announcement of Codex CLI being rewritten in C++ would be met with confusion and derision.
energy123 2 weeks ago
Why would you say this for Rust in particular?
falcor84 2 weeks ago
csomar 2 weeks ago
wrsh07 2 weeks ago
energy123 2 weeks ago
koakuma-chan 2 weeks ago
The comment you linked is talking about an unspecified application's runtime errors.
ralusek 2 weeks ago
When I have a smallish application, with tests, written in one language, letting an LLM convert those files into another language is the single task I'm most comfortable handing over almost entirely. Especially when I review the tests and all tests in the new language are passing.
lmm 2 weeks ago
pjmlp 2 weeks ago
Note that most of these rewrites wouldn't be needed if the JIT language were Java, C#, Common Lisp, Dart, Scheme, or Racket.
And all of that list also have AOT compilers, and JIT cache tooling.
NitpickLawyer 2 weeks ago
If this catches on, and more tools get the "chatgpt, translate this into rust, make it go brrr" treatment, hopefully someone puts in the time & money to take Tauri the extra 10-20% needed to make it a universal Electron replacement. Tauri is great, but still has some pain points here and there.
h1fra 2 weeks ago
gofreddygo 2 weeks ago
What could it be that drives the ROI ?
gschier 2 weeks ago
rane 2 weeks ago
miki123211 2 weeks ago
CSMastermind 2 weeks ago
amazingamazing 2 weeks ago
Seems like confirmation bias.
crop_rotation 2 weeks ago
serverlessmania 2 weeks ago
Claude Code tends to write meaningless tests just to get them to pass—like checking if 1 + 1 = 2—and somehow considers that a job well done.
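The kind of thing I mean, as a made-up but representative example:

```rust
// A test like this passes on every run and tells you nothing about the
// feature it was supposed to cover.
#[test]
fn addition_still_works() {
    assert_eq!(1 + 1, 2);
}
```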
smohare 2 weeks ago
adsharma 2 weeks ago
If it's not possible today, what are the challenges and where does a human need to step in and correct the model?
csomar 2 weeks ago
adsharma 2 weeks ago
pjmlp 2 weeks ago
Waiting for Show HN: AbrasionNext, the framework evolution for frontend devs, with SaaS cloud deployment.
wiseowise 2 weeks ago
ogoffart 2 weeks ago
Trying to bring that with Slint: https://slint.rs
csomar 2 weeks ago
This is not happening. The new folks have moved to SPA/RSC and a RoR type framework doesn't make much sense in that context.
spiderice 2 weeks ago
tonyhart7 2 weeks ago
It's already HERE: https://loco.rs/
I'm writing a production-level app with it right now.
satvikpendem 2 weeks ago
I did this for several projects, works great with much lower costs and compute/memory usage.
gavinray 2 weeks ago
It's a CLI tool that makes API calls. I'd bet my bottom dollar that the performance difference between API-wrapping CLI tools in something like Ruby/Python vs Rust/C++ is negligible in perceived experience.
If they wanted people to not have a dependency on Node pre-installed, they could have shipped Single Executable Applications [0] or used a similar tool for producing binaries.
Or used Deno/Bun's native binary packaging.
[0] - https://nodejs.org/api/single-executable-applications.html
franga2000 2 weeks ago
ramoz 2 weeks ago
It's often parallel processing of I/O (network, filesystem) and computational tasks like testing and compiling code.
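As a rough sketch (the command and paths are stand-ins, not what Codex actually runs): do the filesystem work and an external compute-heavy task on separate threads and wait for both, instead of one after the other.

```rust
// Rough sketch: filesystem I/O and a compute-ish external task in parallel on
// native threads. The command and directory here are stand-ins.
use std::fs;
use std::process::Command;
use std::thread;

fn main() {
    // Thread 1: read files in the current directory (I/O-bound).
    let reader = thread::spawn(|| {
        let mut bytes = 0usize;
        for entry in fs::read_dir(".").expect("readable dir").flatten() {
            if let Ok(data) = fs::read(entry.path()) {
                bytes += data.len();
            }
        }
        bytes
    });

    // Thread 2: run the test suite in a child process (compute-bound).
    let tests = thread::spawn(|| Command::new("cargo").arg("test").status());

    let bytes = reader.join().expect("reader thread panicked");
    let status = tests.join().expect("test thread panicked");
    println!("read {} bytes while tests finished with {:?}", bytes, status);
}
```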
yahoozoo 2 weeks ago
tptacek 2 weeks ago
geertj 2 weeks ago
joshka 2 weeks ago
tymscar 2 weeks ago
If codex was half as good as they say it is in the presentation video, surely they could’ve sent a request to the one in chatgpt from their phone while waiting for the bus, and it would’ve addressed your comments…
winrid 2 weeks ago
joshka 2 weeks ago
suddenlybananas 2 weeks ago
joshka 2 weeks ago
apwell23 2 weeks ago
I suspect it's not much, because I never see any stats published by any of these companies.
NitpickLawyer 2 weeks ago
> Aider writes most of its own code, usually about 70-80% of the new code in each release. These statistics are based on the git commit history of the aider repo.
apwell23 2 weeks ago
Interesting model: every accepted prompt response gets a git commit, without human modifications.
quotemstr 2 weeks ago
Likewise, you can make a single file distribution of a TypeScript program just fine. (Bun has built in support even.) But people don't think of it as a "thing" in that ecosystem. It's just not the culture. TypeScript means npm or Electron. That's the equivalence embedded in the industry hive mind.
To be clear, I'm not decrying this equivalence. Simplification is good. We use language as a shorthand for a bundle of choices not necessarily tied to language itself. You can compile Python with Nuitka or even interpret C. But why would you spend time choosing a point on every dimension of technical variation independently when you could just pick a known good "preset" called a language?
The most important and most overlooked part of this language choice bundle is developer culture. Sure, in principle, language choice should be orthogonal to mindset, areas of interest, and kinds of aptitude. But let's be real. It isn't. All communities of human beings (and Go developers) evolve shared practices, rituals, shibboleths, and priesthoods. Developer communities are no exception.
When you choose, say, Rust, you're not just choosing a language. You're choosing that collection of beliefs and practices common among people who like to use that language. Rust people, for example, care very much about, say, performance and security. TypeScript people might care more about development speed. Your C people are going to care more about ABI stability than either.
Even controlling for talent level, you get different emphases. The Codex people are making a wire format for customizing the agent. C people would probably approach the problem by making a plugin system. TypeScript people would say to just talk to them and they'll ship what you need faster than you can write your extension.
Sometimes you even see multiple clusters of practitioners. Game development and HFT might use the same C++ syntax, but I'd argue they're a world apart and less similar to each other than, say, Java and C# developers are.
That's why language debates get so heated: they're not just expressing a preference. They're going to war for their tribe. Also, nothing pisses a language community off more than someone from a different community appropriating their sacred syntax and defiling it by using it wrong.
Codex isn't so much swapping out syntax as making a bet that Rust cultural practices outcompete TypeScript ones in this niche. I'm excited to see the outcome of this natural experiment.
rgbrgb 2 weeks ago
pjmlp 2 weeks ago
We are in the middle of a transition in programming paradigms.
Let the AI coding models flamewars start.
pjmlp 2 weeks ago
Unfortunately that is a utopia that will never be realised.
People will keep learning programming languages based on hearsay, whatever books they find somewhere, influencers and what not.
quotemstr 2 weeks ago
ThouYS 2 weeks ago
2 weeks ago
jbellis 2 weeks ago
jauntywundrkind 2 weeks ago
> Optimized Performance — no runtime garbage collection, resulting in lower memory consumption
Introducing the list (efficiency resonates with me as a more specific form of performance):
> Our goal is to make the software pieces as efficient as possible and there were a few areas we wanted to improve:
jbellis 2 weeks ago
the others ("zero dependencies") are not actually related to efficiency
jauntywundrkind 2 weeks ago
Efficiency is the top-level goal, and that translates directly to performance in most computing tasks: more efficiency means being able to let other work happen. There are times when outright single-threaded speed is better, but usually in computing we try as hard as possible to get parallelism or concurrency into our approaches, so that efficiency translates directly to overall performance.
Overall performance as a bullet seems clear enough. Yes, it's occluded by a mention of GC, but I don't think the team is stupid enough to think GC is the only performance win they might get here, even if they don't list other factors.
Even a pretty modest bit of generosity makes me think they're doing what was asked for here. Performance is explicitly present and, to me, quite clearly an objective.
simianwords 2 weeks ago
vendiddy 2 weeks ago
CGamesPlay 2 weeks ago
This is interesting, because the current Codex software supports third-party model providers. This makes sense for OpenAI Codex, because it is the underdog compared to Claude Code, but perhaps they have changed their stance on this.
[edit] Seems that this take is incorrect; the source is in the tree.
mappu 2 weeks ago
mdaniel 2 weeks ago
I would bet it took more wall-clock time to type out that comment than it would have for any number of AI agents to snap the required equivalent of `if not re.match(...): continue` into place
quesera 2 weeks ago
// TODO: Verify server name: require `^[a-zA-Z0-9_-]+$`?
There may be several elements of server name verification to perform. That regex does not cover the complete range of possibilities for a server name.
The author apparently decided to punt on the correctness of this low-risk test -- pending additional thought, research, consultation, or simply higher prioritization.
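If and when they do tighten it, the check itself is tiny. A sketch using the regex crate (the helper name is hypothetical, not from the repo):

```rust
// Sketch of the validation the TODO hints at, using the regex crate.
// `is_valid_server_name` is a hypothetical helper, not taken from the codebase.
use regex::Regex;

fn is_valid_server_name(name: &str) -> bool {
    // Same pattern as in the TODO; a real check might also cap the length
    // and reserve certain names.
    let re = Regex::new(r"^[a-zA-Z0-9_-]+$").expect("pattern is valid");
    re.is_match(name)
}

fn main() {
    assert!(is_valid_server_name("my-mcp_server1"));
    assert!(!is_valid_server_name("bad name!"));
    println!("checks pass");
}
```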
CGamesPlay 2 weeks ago
Reviewing the source for this tree, looks like it's been in public development for a fair amount of time, with many PRs.
xyzzy_plugh 2 weeks ago
justachillguy 2 weeks ago
npalli 2 weeks ago
drodgers 2 weeks ago
Architecting the original 100kloc program well requires skill, but that effort is heavily front loaded.
light_hue_1 2 weeks ago
It's a way to close off Codex. There's no point in making a closed-source Codex if it's in TypeScript. But there is if it's in Rust.
This is just another move to make OpenAI less open.
phito 2 weeks ago
antimora 2 weeks ago
lioeters 2 weeks ago
laurent_du 2 weeks ago
tux3 2 weeks ago
The neat thing for me is just not needing to set up a Node environment. You can copy the native binary and it should run as-is.
Wowfunhappy 2 weeks ago
satvikpendem 2 weeks ago
koakuma-chan 2 weeks ago
yahoozoo 2 weeks ago
littlestymaar 2 weeks ago
wrsh07 2 weeks ago
Also, ideally your lightweight client logic can run on a small device/server with bounded memory usage. If OpenAI spins up a server for each codex query, the size of that server matters (at scale/cost) so shaving off mb of overhead is worthwhile.
qsort 2 weeks ago
The big one is not having node as a dependency. Performance, extensibility, safety, yeah, don't really warrant a rewrite.
mrweasel 2 weeks ago
There shouldn't be any reason why you couldn't, and it would give you performance and a zero-dependency install.
littlestymaar 2 weeks ago
Edit: ah, I see, I read “LLM” instead of LLVM at first! It's only after I posted my question that realized my mistake.
I'm not sure it makes sense to compile JavaScript natively; due to the very dynamic nature of the language, you'd end up with a very slow implementation (JIT compilers make assumptions to optimize the code and fall back to the slow baseline when those assumptions are broken, but you can't do that with AOT).
mrweasel 2 weeks ago
That's a good point, maybe TypeScript would be a better candidate.
crabmusket 2 weeks ago
For what it would take to compile TS to native code, check out AssemblyScript.
wiseowise 2 weeks ago
Astral folks are taking notes. (I wouldn't be surprised if they already have a super secret branch where they rewrite Python and make it 100x faster, but without AI bullshit like Mojo).
jacob019 2 weeks ago
semiinfinitely 2 weeks ago
winrid 2 weeks ago
rcleveng 2 weeks ago
This needs admin permissions, which means a ticket with IT and a good chance it'll be rejected since it's scary as it'll open up the door to many admin level installs of software that IT has no control over.
Installing node under WSL is a better approach anyway, but that'll make it harder for enterprise customers still.
pjmlp 2 weeks ago
https://nodejs.org/en/download
I never used nvm.
If someone doesn't get this, it is a skill issue.
mahmoudimus 2 weeks ago
Yes, if I spent more time learning these things, it would become simple but that seemed like a massive waste of time.
rileytg 2 weeks ago
mahmoudimus 2 weeks ago
unshavedyak 2 weeks ago
puskuruk 2 weeks ago
karn97 2 weeks ago
wiseowise 2 weeks ago
Hilarious take.
alexchamberlain 2 weeks ago
satvikpendem 2 weeks ago
Usage is in the eye of the user, I see.