「AI programmers」
Anonymous
Recently got access to OpenAI's Codex and I played around with it a bit.
Sometimes it generates exactly what I ask, but the next day if I ask the same
thing it spits out absolute garbage code: instead of iterating over an
array, it literally writes out every single index and does operations
on them one by one. Is it really just copying someone's code from GitHub?
If that's true then it's a huge gamble. I can imagine that if I were working on a
big project I would have to generate and then check every single line
carefully for bugs; at that point I'd rather write the code myself, because
it's much easier for me to write code and know exactly how it works
than to try to understand what someone else wrote.
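To make the complaint above concrete, here's a minimal sketch (with made-up function names, not actual Codex output) contrasting an idiomatic loop with the kind of index-by-index unrolling being described:

```python
# Idiomatic version: iterate over the array.
def sum_squares(nums):
    return sum(n * n for n in nums)

# "Unrolled" version: the kind of output described above,
# touching each index one by one instead of iterating.
def sum_squares_unrolled(nums):
    total = 0
    total += nums[0] * nums[0]
    total += nums[1] * nums[1]
    total += nums[2] * nums[2]
    return total  # silently wrong for any length other than 3

print(sum_squares([1, 2, 3]))           # 14
print(sum_squares_unrolled([1, 2, 3]))  # 14, but brittle
```

Both give the same answer on this input, which is exactly why the unrolled version is dangerous: it only breaks once the array length changes.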

Still, I think this is just the beginning. Language models are going
to get better and better. I don't think they will completely replace
programmers short of AGI, but they will definitely change the field in a major way.

Will this cause a decline in demand for programmers? I know the creation of
high-level languages did the opposite, and Codex is basically the same thing:
another way of telling a computer what to do,
except one that's becoming more and more like plain English.
I don't know if this never-ending climb to higher levels of abstraction
will always create more jobs or if it will halt at a certain point.
And this might well be that point, because it will basically
just require you to know English and that's it. Even though there
must still be someone who actually understands the code the AI generates, you
can see how this significantly reduces the amount of work you have to do.

I guess I'm worried because I'm just entering the field, and I wonder
if I'll struggle to keep a job in a few years.
What do you think?