62 points by sandwichsphinx 11 hours ago | 62 comments
wmal 10 hours ago
The video shows a private fork of a public repository. The bug is real, but it was resolved in February 2023, and it doesn't seem like the solution was automated [1]
The bug has a stack trace attached with a big arrow pointing to line 223 of a backend_compat.py file. A quick glance at this stack trace and you already know what happened, why, and how to fix it, but…
not for the agent. It seems to analyze the repository in multiple steps and tries to locate the class. Why did they even release this video?
wmal 10 hours ago
This way it would at least look like it might work.
negoutputeng 9 hours ago
We are at a phase where the early adopters have seen the writing on the wall, i.e. that LLMs are useful for a limited set of use cases. But there are lots of late adopters who are still awestruck and not yet disillusioned.
negoutputeng 10 hours ago
So they organize hackathons where devs build a hypothetical agentic framework nobody will dare use, so that mgmt can claim: look what I've done to be agentic.
You should ask: would you dogfood your own agent? And the answer is no way. These are meant purely for marketing purposes, as they don't meet an end-user need.
negoutputeng 10 hours ago
Just goes to show it's all a big song and dance. Much ado about nothing.
jjmarr 10 hours ago
The term "agent" implies you can give the AI full access to your repos and fire the software engineers you're grudgingly paying six figures to.
The second is much more valuable to executives who don't want to pay the software people who demand higher salaries than virtually everyone else in the organization.
negoutputeng 9 hours ago
i am saying, the thing is snake-oil - a solution looking for a problem.
BugsJustFindMe 10 hours ago
I made a few minor edits, but I think we all know this is coming. This calls itself "for developers" for now, but really it's also "instead of developers", and at some point the mask will come off.
digging 9 hours ago
> LLMs are a tool, nothing more, they don't magically imbue the user with competency.
Not a good take though, IMO. They're literally a tool that can teach you how to use them, or anything else.
Workaccount2 9 hours ago
Although I also wonder about the development of new languages optimized for transformers, as it seems clumsy and wasteful to have transformers juggle all the tokens needed to make code readable by humans. It would be really interesting to have a model that outputs code that functions incredibly well but is indecipherable to humans.
lwhi 9 hours ago
I don't think junior devs are going to benefit; if anything, the whole role of 'junior' has been made obsolete. The rote, repetitive work a junior would traditionally do can now be delegated wholesale to an LLM.
I figure productivity is going to increase a lot. We'll need fewer developers as a result. The duties associated with developers are going to morph and become more solutions/architecture oriented.
alkonaut 9 hours ago
I have very little fear for my own job no matter how good models get. What happens is that software gets cheaper and more of it is bought. It’s what happened in every industry with automation.
Those who can’t operate a machine though (in this case an AI) should maybe worry. But chances are their jobs weren’t very secure to begin with.