The Software Engineering Identity Crisis

143 points by napolux 3 months ago | 159 comments
  • dandellion 3 months ago
    I've been using AI for code for more than two years already. The auto-completion is a nice help that I'm willing to pay for, but every time I try anything harder than the basics it completely falls flat.

    It doesn't surprise me though, most of the people working on this are the same ones who had been promising self-driving cars. But that proved to be quite hard, and most of them moved on to the next thing, which is this. So maybe a decade from now we'll be directing AIs instead of writing code. Or maybe that will also be difficult, and people will have moved on to the next thing they'll fail to replace with AI.

    • torginus 3 months ago
      This. All these reasoning models that push out complete modules of code tend not to write the code I would have - and have difficulty writing code that matches up with the rest of the codebase. And even if it does, the burden of understanding what the AI does falls onto my shoulders, and as everyone knows, understanding someone else's code is 10x harder than writing your own.
      • clown_strike 3 months ago
        > And even if it does, the burden of understanding what the AI does falls onto my shoulders, and as everyone knows, understanding someone else's code is 10x harder than writing your own.

        Which means we're doing it backwards.

        AI writes code as well as a committee of StackOverflow users but reads it without breaking a sweat. Use AI to do the harder part.

      • margalabargala 3 months ago
        Even in the best case scenario for LLMs, they aren't mind readers. They'll be time savers. They're more like compilers from natural language to code, the way actual compilers transform code to assembly.

        The job has changed before, it will change again. It may get easier to enter the industry, but the industry will still exist and will still need subject matter experts a decade from now.

        • dgellow 3 months ago
          I don’t feel LLMs are the right tool for that. It’s nice to have something that behaves as if it understood my requests and business model, but we need reproducibility; otherwise it’s too unpredictable.

          I also don’t like English as a language to express requirements; it’s not strict enough and depends too much on implicit context. Whatever high-level abstraction we end up with, it cannot be something that results in the wrong implementation because the agent incorrectly read the tone of the exchange.

          • lunarboy 3 months ago
            That's why code was invented in the first place no? But now LLMs are in the ballpark where lay people can describe something vaguely and get a working MVP. Whether you can ship, scale, and debug that code is a completely different question
            • kristianc 3 months ago
              > Whatever high-level abstraction we end up with, it cannot be something that results in the wrong implementation because the agent incorrectly read the tone of the exchange.

              That can easily be said of most Product Manager > SWE relationships too though

            • AbstractH24 3 months ago
              > Even in the best case scenario for LLMs, they aren't mind readers. They'll be time savers.

              In other words, they are cheaper mid-to-entry-level employees that don’t get sick or have emotions. I think most people would agree with this.

              One of the reasons well-informed, curious people tend to underestimate the value of LLMs despite their flaws is that they underestimate the amount of routine work that could be automated but is still done imperfectly by humans. LLMs lower the barrier to entry for halfway-decent automation and for eliminating those jobs.

              Where I’m not sold yet is on the whole idea of bots that go off and do their own thing totally unsupervised (but increasingly, you are having models supervise one another).

            • ddoolin 3 months ago
              Honestly, I don't know why nobody says it, but I just _don't want to._ I don't want to use it too much. It's not that I'm paranoid about it outputting bad code, but I just like doing almost all of it myself. It helps that I will do better than it 100% of the time but that isn't really why I don't use it. If it's going to replace all of us, fine. I guess you can chalk me up as not being into this hyper-productivity mindset. I just want to write code, period. I use it in the same manner as I see most comments saying; that is, as a fancy code complete, but I haven't found myself wishing "if only this could do all of it for me!"
              • danielbln 3 months ago
                I'll provide a counter perspective: I love building things, but the minutiae and implementation details of code are merely a means to an end for me, something that stands between me and a feature/experiment. Agentic coding, especially when doing greenfield work or prototyping, takes the grunt work away and lets me build at a higher abstraction level. And I love it, though I can see that that's not for everyone.
                • ddoolin 3 months ago
                  Totally get it. And there has been work for me where I would've offloaded the details like that for sure. It might be that it's when I really feel like the project is my baby and I'm really enjoying that part of the process.
                • CharlieDigital 3 months ago
                  I think a good analogy is 3D printing. Certainly, it's faster from idea to prototype, but the actual act of carving wood or molding clay is itself a creative process that for some will never be replaced because we have cheap commodity 3D printers.
                  • elliottkember 3 months ago
                    3D printing is an interesting analogy. When I got my printer I really truly thought I’d be printing all sorts of things to use around the house and gadgets and stuff.

                    It turns out that to be good at 3D printing, you need to be really good at CAD and at measuring stuff with Vernier calipers. That’s like prompt engineering.

                    Then there was the nozzle temperature, print errors, and other strange results — call those hallucinations.

                    Once I had designed something that I needed many instances of, it was great. But for one-offs, it was a lot of work. So it goes with AI.

                  • SJC_Hacker 3 months ago
                    I've found it helpful in eliminating a lot of boilerplate, especially for languages with static typing. E.g. in C++, if I declare something like class Foo { int a; int b; public: Foo(); }; it will normally start autocompleting the constructor's initializer list.
                  • AaronAPU 3 months ago
                    Every time someone says this, when I ask they haven’t used o1-pro. Not saying that’s the case with you but I have to ask.

                    In my experience it’s literally the only model which can actually code beyond auto-complete. Not perfect but a completely different tier above the rest.

                    • 9dev 3 months ago
                      People have been saying that since the start, too, albeit about different models. It never felt revolutionary; the moment I asked about a particularly gnarly recursive generic type problem, or something that requires insights from across the code base, it was just rubbish from all models. Good to finish the line I wanted to write, bad at creating software.
                      • FirmwareBurner 3 months ago
                        >People have been saying that since the start

                        The progress made since the start has been wild, and if it keeps increasing, even at a much slower pace, it's gonna be even better.

                        That's like people looking at N64 games saying "wow, these new 3D graphics sure look like ass, they'll never catch on and replace 2D games". Or like people looking at the output of early C compilers going "wow, this is so unoptimized, I'll stick to coding in assembly for my career since nobody will ever use compilers for serious work".

                        It boggles my mind how ignorant people can be about progress and disruption based on how past history played out. Oh well, at least more power to those who embrace the new tech early on.

                        • AaronAPU 3 months ago
                          I’ll put you down as a “No, I haven’t used o1-pro”
                        • J_Shelby_J 3 months ago
                          O1-pro is the first time I’m actually hesitant to recommend an AI tool. Out of selfishness perhaps. It’s the true kick off of the AI career Cold War.
                          • NBJack 3 months ago
                            > Or maybe that will also be difficult, and people will have moved on to the next thing they will fail to replace with AI.

                            Probably quantum computing. That seems to be the next hyped up product.

                            • AbstractH24 3 months ago
                              Hasn’t that been hyped for ages already?
                            • m463 3 months ago
                              > anything that's harder than the basics it completely falls flat

                              I kind of wonder if there's a way to make ai-accessible software.

                              For example, let's say someone wrote a really descriptive tutorial on Blender, covering not only simple features but advanced ones, and added some college texts on adjacent problems to help prevent "falling flat" at more difficult tasks.

                              Could something like that work? I figure LLMs are just reading simple tutorials now; what about feeding them advanced material?

                              • justanotherjoe 3 months ago
                                What you said struck me:

                                > It doesn't surprise me though, most of the people working on this are the same that had been promising self-driving cars.

                                 Where did you pick that up? That's just pure hallucination (ironically). Also, what are you implying, that Tesla makes LLMs now? If I had to guess their contribution to LLMs, it'd be way closer to 0 than 'most'.

                                It's one of those things that is so easy to say but has no actual truth.

                                • aerokr 3 months ago
                                  Self-driving cars exist and are safer than human drivers, Waymo being the obvious example - https://waymo.com/research/do-autonomous-vehicles-outperform.... Replacement is now a matter of large-scale deployment and coordination with legal frameworks, which isn’t the same problem as self-driving itself.
                                  • carlmr 3 months ago
                                    Has Waymo figured out how to scale this yet? Last time I checked they needed highly precise and up-to-date maps that just aren't available for every location, and thus they're limited to small test regions.
                                    • mdaniel 3 months ago
                                      > highly precise and up to date maps

                                      I would guess that's one of those "pick any two" things, given how many construction projects, repair, "life happens" stuff goes on in a modern city

                                      That said, unless I'm totally missing something, they have a self-solving problem there, since the Waymos all carry around cameras, lidar, and presumably radar, so I would expect that they update the maps as they go. Come to think of it, that's very likely why I originally saw them roaming around the city with drivers in them: testing the pathfinding and mapping at the same time.

                                  • greenie_beans 3 months ago
                                    meanwhile i'm building entire features with it, and they work without bugs and i understand all the code.
                                    • sceptic123 3 months ago
                                      In what language? On what size of code base?
                                      • greenie_beans 3 months ago
                                        python, django, react, javascript, vanilla js, html, css, tailwind, htmx, bash, etc, etc, etc
                                  • meander_water 3 months ago
                                    This reduces the field of software engineering to simple code generation when it is much more than that.

                                    Things like system design thinking and architectural design are not solely tasks performed by managers or specialised roles.

                                    Software developers need to wear multiple hats to deliver a solution. Sure, building out new features or products from scratch often gets the most glory and attention. But IMO, humans still have the edge when it comes to debugging, refactoring and optimisation. In my experience, we beat AI at these problems because we can hold the entire problem space/context in our brains, and reason about it. In contrast, AI is simply pattern matching, and sure it can do a great job, but only stochastically so.

                                    • leoedin 3 months ago
                                      The foundation of maintainable software is architecture. I can't be alone in having often spent days puzzling over a seemingly highly complex problem before finally finding a set of abstractions that makes it simple and highly testable.

                                      LLMs are effectively optimisation algorithms. They can find the local minima, but asking them to radically change the structure of something to find a much simpler solution is not yet possible.

                                      I'm actually pretty excited about LLMs getting better at coding, because in most jobs I've been in the limiting factor has always been rate of development rather than rate of idea production. If LLMs can take a software architecture diagram and fill in all the boxes, that would mean we could test our assumptions much quicker.

                                      • Centigonal 3 months ago
                                        Yes, this is how I feel as well. I'm not going to use an LLM to create my architecture for me (though I may use it for advice), because I think of that as the core creative thing that I bring into the project, and the thing I need to fully understand in order to steer it in the right direction.

                                        The AI is great at doing all the implementation grunt work ("how do I format that timestamp again?" "What's a faster vectorized way to do this weird polars transformation?" "Can you write tests to catch regressions for these 5 edge cases which I will then verify?").
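                                        That timestamp question really is the scale of task being delegated here: a few lines that are trivial to verify by eye. A minimal sketch in Python (the function name and format string are my own illustration, not from the thread):

                                        ```python
                                        from datetime import datetime, timezone

                                        def format_timestamp(ts: float) -> str:
                                            """Render a Unix timestamp as an ISO-8601 UTC string."""
                                            return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

                                        # The "tests for edge cases which I will then verify" part is just as
                                        # mechanical: the epoch and day boundaries are easy to pin down by hand.
                                        assert format_timestamp(0) == "1970-01-01T00:00:00Z"
                                        assert format_timestamp(86400) == "1970-01-02T00:00:00Z"
                                        ```

                                        The human keeps the design decision (UTC, ISO-8601); the LLM fills in the strftime incantation nobody remembers.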

                                        • skydhash 3 months ago
                                          Almost every time I read about someone finding LLMs useful for a programming task, the description of how the LLMs are used sounds like either the person is missing domain knowledge, doesn't use a capable editor, or isn't familiar with reading docs.

                                          When I find myself missing domain knowledge, my first action is to seek it. Not to try random things that may have hidden edge cases that I can't foresee. The semantics of every line and every symbol should be clear to me. And I should be able to go in details about its significance and usage.

                                          Editing code shouldn't be a bottleneck. In The Pragmatic Programmer, one piece of advice is to achieve editor fluency. And even Bram has written about this[0]. Code is very repetitive, and your editor should assist you in reducing the amount of boilerplate you write and in navigating around the codebase. Why? Because that will help you prune the code and get it in better shape, as code is a liability. Generating code is a step in the wrong direction.

                                          There can be bad docs, or the information you're seeking may not be easily retrievable. But most docs are actually quite decent, and in the worst case you have the source code (or should). There are different kinds of docs, and when someone complains about them, it's usually because they need a tutorial or a guide to learn the concepts and usage. Most systems assume you have the prerequisites and provide only the reference.

                                          [0]: https://www.moolenaar.net/habits.html

                                      • slt2021 3 months ago
                                        a lot of value of software engineers is talking to users (end users, clients etc)
                                      • bsder 3 months ago
                                        Want to convince me of AI coding? Let's see AI go modernize the old X11 codebase. Wayland progress is so glacially slow that a motivated programmer should be able to run rings around them with AI on X11, right? Show me that, and I'll pay attention to "AI".

                                        > Many of us don’t just write code - we love writing code.

                                        Excuse me, I HATE writing code.

                                        Code is the thing that stands between what I want the computer to do and the computer doing it. If AI reduces that burden, I would be the first to jump on that wave.

                                        GUI programming still sucks. GPU programming still sucks. Embedded programming still sucks. Concurrent programming still sucks. I can go on and on.

                                        I was actually telling somebody the other day that 99% of my programming is shaving yaks and 1% is actually focused on the problem I want to solve.

                                        When AI starts shaving the yaks for me, I'll start getting excited.

                                        • chickenzzzzu 3 months ago
                                          I would like to say this as unambiguously as possible. You either have a skill issue, or you are deliberately solving problems other than the thing you actually want to solve, which you do not mention.
                                          • 9dev 3 months ago
                                            I think he’s got a point actually. How many times has a compiler told you about a missing semicolon “here”, and you’ve thought, “well, if you’re so clever, why don’t you just fix it?!”

                                            Code is an awful abstraction to capture chains of thoughts, but it’s the best we’ve got. Still, caring about syntax, application architecture, concurrency, memory layout, type casting, …—all of that is just busywork, not making the robot go beep.

                                            • mdaniel 3 months ago
                                              ed: or did you omit a "not" as in "not just busywork"?

                                              > application architecture,

                                              I think you got carried away with your analogy there, because I can assure you that if the LLM generates kafkaClient.sendMessage everywhere for latency sensitive apps, that's not gonna go well, or similar for httpClient.post for high throughput cases

                                            • blacksoil 3 months ago
                                              > You either have a skill issue, or you are deliberately solving problems other than the thing you actually want to solve, which you do not mention.

                                              Wow the first one is quite judgmental. FWIW, I'm the same way, I like writing some code, but definitely not all code. Writing CRUD boilerplate for a schema? No thank you, AI is more than welcome to take that.

                                              I derive my joy from getting something done, making it actually usable by the business side, and ultimately have it to generate revenue.
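                                              The CRUD point is easy to make concrete: it's exactly the kind of layer where every method is the same shape. A sketch in plain Python with a hypothetical Customer schema and an in-memory store (a real app would use an ORM and a database; all names here are illustrative):

                                              ```python
                                              from dataclasses import dataclass
                                              from typing import Dict, Optional

                                              @dataclass
                                              class Customer:
                                                  id: int
                                                  name: str
                                                  email: str

                                              class CustomerStore:
                                                  """In-memory CRUD layer; the same four-method shape repeats per table."""

                                                  def __init__(self) -> None:
                                                      self._rows: Dict[int, Customer] = {}
                                                      self._next_id = 1

                                                  def create(self, name: str, email: str) -> Customer:
                                                      row = Customer(self._next_id, name, email)
                                                      self._rows[row.id] = row
                                                      self._next_id += 1
                                                      return row

                                                  def read(self, id: int) -> Optional[Customer]:
                                                      return self._rows.get(id)

                                                  def update(self, id: int, **fields) -> Optional[Customer]:
                                                      row = self._rows.get(id)
                                                      if row is not None:
                                                          for key, value in fields.items():
                                                              setattr(row, key, value)
                                                      return row

                                                  def delete(self, id: int) -> bool:
                                                      return self._rows.pop(id, None) is not None
                                              ```

                                              Multiply this by every entity in a schema and the appeal of delegating it becomes obvious; none of it is where the interesting decisions live.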

                                            • cheevly 3 months ago
                                              Let me see you build an airplane with bricks and mortar, only then will I be excited by the power of flight.
                                              • Werewolf255 3 months ago
                                                "Look, you're being a pessimist. Yes, over half of the airplanes we make kill everyone on board.

                                                But you're really NOT focused on the planes that kill only 75% of the crew and passengers. Just think! In ten years, we'll know how to build a plane!

                                                Please stop telling me that we already have designs for planes that work. I don't want to hear that anymore."

                                                • Apocryphon 3 months ago
                                                  You’re saying X11 is that bad?
                                              • hnthrow90348765 3 months ago
                                                My guess is that job requirements will grow even larger, so it will be better for people who like jumping around between front-end, back-end, infrastructure, database, product, support, testing, and management duties. You'll have to resist any uncomfortable feelings of not being good at any one thing, much less mastering it. Naturally, they won't ask non-technical staff and managers to suddenly become technical and learn to code with AI.

                                                In the grander scheme of things, what matters is if their products can still sell with AI coders making it. If not, then companies will have to pivot back to finding quality - similar to offshoring to the cheapest, getting terrible developers (not always) and a terrible product, then having to rehire the team again.

                                                If the products do sell with AI coders, then you have to reckon with a field that doesn't care about quality or craftsmanship and decide if you can work like that, day-in-day-out.

                                                • rreichman 3 months ago
                                                  I think we can expect a bifurcation: managerial jobs that will require a lot of breadth and engineering jobs that will require a lot of depth. The manager engineers will have AIs doing all sorts of things for them across the stack. The deep engineers will develop an expertise that the AI can't get to (at least not yet).
                                                  • JaDogg 3 months ago
                                                    Yes this is what I call "the accountant/spreadsheet theory", and I think this is the most likely scenario.
                                                    • lunarboy 3 months ago
                                                      I agree this is where things seem to be going in the 5-10 year frame. The spinning wheel didn't obsolete weavers completely; it just allowed for more workers and more throughput at less skill. I think entry-level junior devs will be out of jobs, but unless these AIs can start coming up with coherent high-level designs, higher-level architects seem to be okay in that time frame at least.
                                                      • mistrial9 3 months ago
                                                        architectural design was very well paid for a long time, for many individuals. In modern USA, there is almost no way a person could be an architect for a living -- there is no career path. Employers in finance and other core business are already bragging that eighty percent of coding will be AI. Executives want to fire coders, and lower the wages for coders, and have complete control over output of coders. AI is being sold for that today.
                                                    • roxolotl 3 months ago
                                                      The secret is that looking to work as a way to fulfill that desire to build and create is not a good idea. The existence of industrial farming takes no joy from my backyard garden. My usage of Gen AI doesn’t diminish the wonder I feel building projects at home.

                                                      Looking to corporate work as an outlet for your creative desires never really worked out. Sure, there was a brief golden age where, if you worked at a big tech company, you could find it, but the vast majority of engineers do utilitarian work. As a software engineer your job has always been to drive business value.

                                                      • v3xro 3 months ago
                                                        It doesn't have to be that way no? As soon as we start realigning economic systems to value labor more than capital again I think we will find meaningful pursuits in all business areas.

                                                        Edit: and yes, I am all too aware of the "market can remain irrational longer than you can remain solvent" adage as applied to this situation.

                                                        • roxolotl 3 months ago
                                                          Oh yea absolutely. But like you’re saying in your edit it might take a long time to get there. I think also in order to get there we have to acknowledge where we are.
                                                        • m3t4man 3 months ago
                                                          It's also not hard to understand why people seek that kind of fulfillment at work. It is something we dedicate most of our day to for most of the week
                                                        • userbinator 3 months ago
                                                          Those who think AI can generate code better than they can, are quite frankly below-average. It's the equivalent of using Google Translate to read and write another language --- and programming languages really do need to be learned as languages to make the most of them. It "works", but the result can never be above average.

                                                          > or systems where performance and reliability are paramount

                                                          Since when has that not been the case? Neglecting or even actively avoiding performance and reliability is why almost all new software is just mediocre at best, and the industry is on a decline. AI is only going to accelerate that.

                                                          • allenu 3 months ago
                                                            I agree that it's possible AI is unable to generate exceptional code (at least not at the moment), but there are definitely places where average or below-average may just be good enough.

                                                            If the goal is to deliver business value, an argument can be made that one could leverage AI for bits of code where high skill isn't required, and that that could free up the human developer to focus on places where high skill is more important (high-level system architecture, data model designs, simpler user experiences that reduce the amount of work overall).

                                                            If a dev just used pure "vibe coding" to generate code and didn't provide enough human oversight to verify the high-level designs, then you can definitely get into an issue where the code gets out of control, but I think there's a middle ground where you have a hybrid of high-level human design and oversight and low-level AI implementation.

                                                            I think the line between how much human involvement there is versus pure AI coding may be a sliding one. For something like a startup that is unsure if their product is even providing enough user value, it might make sense to quickly prototype with AI to see if a product is viable, then if it is, rewrite parts with more human intervention to scale up.

                                                            • ldjkfkdsjnv 3 months ago
                                                              The whole concept of "good code" is built around maintainability for humans. If you remove that requirement, it opens up a whole new definition of what it means to build good software
                                                              • sublinear 3 months ago
                                                                Not true at all. Good code is concise and does what it's supposed to do without any weird side effects or artifacts. That has nothing to do with "maintainability for humans".
                                                                • 9dev 3 months ago
                                                                  Why would you want to avoid side effects, other than making it less hard for the next developer working on the code to understand how it works? Nature uses side effects all the time, and is widely considered to work pretty well.
                                                                • klooney 3 months ago
                                                                  LLMs are even more limited than humans though- tiny context windows, can't really learn new things- which really raises the bar on writing APIs that are footgun free.
                                                                  • danielbln 3 months ago
                                                                    Context windows have grown significantly; the biggest ones are at 2 million tokens right now. That's plenty, as even in a huge codebase you don't need to feed the full codebase in. You just need to provide the LLM a map of e.g. functions and where to find them, and the LLM can pull in the relevant files itself as it plans the implementation path. And for that the current context window is plenty.
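                                                                    That "map of functions" is cheap to build for, say, a Python codebase. A rough sketch using the standard ast module (the function name is my own; real tools add classes, docstrings, and signatures):

                                                                    ```python
                                                                    import ast
                                                                    from pathlib import Path

                                                                    def repo_function_map(root: str) -> dict:
                                                                        """Index 'function_name' -> 'relative/path.py:line' for every def in a tree.

                                                                        An agent can be handed this compact map instead of the full source,
                                                                        then fetch only the files it decides are relevant.
                                                                        """
                                                                        index = {}
                                                                        for path in Path(root).rglob("*.py"):
                                                                            tree = ast.parse(path.read_text(encoding="utf-8"))
                                                                            for node in ast.walk(tree):
                                                                                if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                                                                                    index[node.name] = f"{path.relative_to(root)}:{node.lineno}"
                                                                        return index
                                                                    ```

                                                                    The map for a large repo is a few thousand tokens, which is why it fits comfortably even in much smaller context windows than 2 million.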
                                                                    • sumedh 3 months ago
                                                                      > LLMs are even more limited than humans though- tiny context windows,

                                                                      For now.

                                                                    • techpineapple 3 months ago
                                                                      Maybe the whole concept of "clean code" is built around maintainability for humans, but I think there's a way of organizing "good code" that makes it much easier to avoid things like nested-loop footguns.
                                                                      • datadrivenangel 3 months ago
                                                                        Using LLMs as compilers is unwise. Ultimately software must be configured or written by humans, even if there are layers and layers of software in between us and 'the code', and that act of configuration/writing is programming.
                                                                        • ldjkfkdsjnv 3 months ago
                                                                          I think a lot of the fastest-moving and most innovative companies right now are generating most code with AI. The ones who ignore this will be left behind.
                                                                          • pjmlp 3 months ago
                                                                            That was the same argument folks used against high level languages when Assembly was king.

                                                                            It will come, even if we are a couple of years away.

                                                                          • jimbokun 3 months ago
                                                                            At that point “building software” goes away as a distinct activity. Everything is a conversation with an AI, that builds software to answer questions or perform tasks you request as needed.
                                                                            • lurking_swe 3 months ago
                                                                              you forgot the pesky "security" aspect of the system :)
                                                                          • kittikitti 3 months ago
                                                                            Most big tech engineers I know hate coding and see abandoning coding as a progression in their career. Personally, I don't care what the industry thinks; I like coding and do it anyway, with or without a huge corporation providing me with every last resource required. If I'm unemployed because coding as a skill is no longer needed, I will still do it, just not as part of my day job. Just as painters were worried about color printers, the value of the Mona Lisa was put into question. I'm no artist, but I'm not believing the overhyped scenarios. I love language models and their abilities, but there are too many HR reps drooling at the mouth thinking they can replace coders with AI.
                                                                            • 18172828286177 3 months ago
                                                                              I’ve been meaning to write essentially this article for a while now.

                                                                              I’m currently prepping for some upcoming interviews, which is involving quite a bit of deep digging into some technical subjects. I’m enjoying it, but part of it feels… pointless. ChatGPT can answer better than I can about the things I’m learning. It is detracting quite a bit from my joy, which would not have been the case 5 years ago

                                                                              • sepositus 3 months ago
                                                                                More than ever modern interviews are pointless. I just finished an SRE technical interview where I had to efficiently solve a problem around maximizing profit in a theoretical market setting. I’m guessing it was just a reframed leet code question. Yet not moments earlier they were talking about their needs with increasing visibility, improving deployment times, etc. At some point this has to break, right? If the article indicates anything, it’s exactly those high level analytical skills they should be testing. I almost think allowing AI would be even better because it allows a conversation about what it got wrong, where it can be improved or is not applicable, etc.
                                                                                • strict9 3 months ago
                                                                                  What you describe has echoes of allowing API lookups and such during an interview. Or whether it's IDE or REPL only, or something like whiteboarding.

                                                                                  You see the process and the questions they ask and evaluate how they distill the responses. The more you let the candidate use sources, the closer it is to day-to-day work.

                                                                                  A somewhat similar equivalent to yesterday's copying and pasting a Stack Overflow or w3 schools solution is blindly copying and pasting a chat response from a quick and vague prompt.

                                                                                  But someone who knows how to precisely prompt or use the correct set of templates is someone with more critical thinking skills that knows when to push back or modify the suggested solution.

                                                                                  Knowing the small % of difference can make a big difference long term in code readability, reliability, and security.

                                                                                  The other big alternative to all of this is strict debugging: a debugging test or chain of thought around quickly identifying and fixing the source of problems. This is a skill whose demand will probably increase over time.

                                                                                  • sepositus 3 months ago
                                                                                    I suspect a large part of the problem is a lack of experienced engineers with the capacity to do interviews. As someone who often gets stuck with them, I can attest to how draining they can be, especially if you're doing them correctly. The problem is that it's _these_ engineers we need running the interview because they can pretty quickly pick out a fraudulent candidate from an exceptional one while giving both the option to use AI. I generally only need about 30 minutes to confidently assess whether someone is worth pushing further in the process.

                                                                                    But while HR would love to make me a full-time interviewer, it rarely makes business sense. So we end up with unqualified people using what they were told are good signals for hiring talent.

                                                                                  • techpineapple 3 months ago
                                                                                    I wonder if software engineers should take something like a modified LSAT. I do think that testing the ability to do some basic coding logic problem does get at the ability to break apart and understand the kind of problems one faces when turning requirements into business logic. I may be the only person in the world that doesn't really see a problem with the way we interview folks, when done skillfully.
                                                                                  • nyarlathotep_ 3 months ago
                                                                                    Exactly.

                                                                                    I'm far less motivated to learn technical topics now than I was even two years ago. I used to crack books/articles open pretty frequently largely for personal reasons but much of my motivation to do so has been removed by the presence of LLMs.

                                                                                    • braebo 3 months ago
                                                                                      This is the part that saddens me the most. Watching that spark of curiosity and intrigue dwindle from the hearts and minds of nerds like you and me. I noticed it start to happen to me back in November when I truly began to understand where we were headed.

                                                                                      I think I occupy a sweet spot now with my current skill set - AI can’t solve the problems I work on - but it can really help empower my workflow in small doses. Nevertheless, it’s a moving target with lots of uncertainty, and the atrophy of my skill and passion is palpable even in the past 6 months.

                                                                                      • nyarlathotep_ 3 months ago
                                                                                        Yeah you nailed it. Glad to hear I'm not alone.

                                                                                        I used to pay for O'Reilly, I have a pile of software/programming/computer-related books, and meh I just don't care for it like I did.

                                                                                        I'd page through stuff at night, and the last year especially have really de-motivated me.

                                                                                  • TeMPOraL 3 months ago
                                                                                    Some good points, but I feel that by the end, the article lost track of an important angle. Quoting from the ending:

                                                                                    > And now we come full circle: AI isn’t taking our jobs; it’s giving us a chance to reclaim those broader aspects of our role that we gave away to specialists. To return to a time when software engineering meant more than just writing code. When it meant understanding the whole problem space, from user needs to business impact, from system design to operational excellence.

                                                                                    Well, I for one never cared about business impact in general sense, nor did I consider it part of the problem space. Obviously, minding the business impact is critical at work. But if we're talking about identity, then it never was a part of mine - and I believe the same is true about many software engineers in my cohort.

                                                                                    I picked up coding because I wanted to build things (games, at first). Build things, not sell things.

                                                                                    This mirrors a common blind spot I regularly see in some articles and comments on HN (which perhaps is just because of its adjacency to startup culture): doing stuff and running a company that does stuff are entirely different things. I want to be a builder, not a founder. Nor do I want to be a manager of builders.

                                                                                    So, for those of us with slightly narrower sense of identity as software engineers, the AI thing is both fascinating and disconcerting.

                                                                                    • spo81rty 3 months ago
                                                                                      I think it comes down to ownership. Going forward it will be more important for engineers to show more product ownership of their domain. Product thinking is becoming more important.

                                                                                      That doesn't mean you are a salesperson. It means you are more connected to the users and their problems.

                                                                                      • TeMPOraL 3 months ago
                                                                                        Except you don't actually own any of it. Ownership belongs to your employer. The only thing you own, is responsibility.
                                                                                      • jimbokun 3 months ago
                                                                                        Well take out the two words “business impact” and the rest still applies to you.
                                                                                        • TeMPOraL 3 months ago
                                                                                          Sure, but these two words are important. They're placing the whole into a different category.

                                                                                          It's kind of like me saying, "I'm not a soldier - being a soldier means exercising a lot, following orders, and occasionally killing people", and you replying, "well take out the two words 'killing people' and the rest still applies to you".

                                                                                      • aforwardslash 3 months ago
                                                                                        Right now, being able to efficiently extract value from AI-based code generation tools requires supervision to some extent, e.g. a competent developer who is able to validate the output. As the industry moves toward these systems, so do the hiring requirements: there is little to no incentive to hire more junior devs (as they lack the experience of building software), effectively killing the entry-level jobs that would someday generate those competent developers.

                                                                                        The thing is, part of the reason AI requires supervision is that it's producing human-maintainable output in languages oriented toward human generation. It is my belief we're at a moment akin to the emergence of the first programming languages: a mimic that allows humans to abstract away machine details and translate high-level concepts into machine language. It is also my belief that the next step is specialized AI languages that will remove most of the human element, as an optimization. Sure, there will always be a need for meatbags, but big companies will hire tens, not thousands.

                                                                                        • 1shooner 3 months ago
                                                                                          I heard an OpenAI engineer give an eye-opening perspective that went something like: anything above machine code is an abstraction for the benefit of the humans that need to maintain it. If you don't need humans to understand it, you don't need your functionality in higher-level languages at all. The AI will just brute-force those abstractions.
                                                                                          • soulofmischief 3 months ago
                                                                                            I don't buy into it. The benefits of abstraction hold for machines, as they can spend less bandwidth when modeling, using and modifying a system, and error is minimized during long, repetitive operations.

                                                                                            Abstraction can be thought of as a system of interfaces. The right abstractions can totally transform how a human or machine interpret and solve a problem. The most effective and elegant machines will still make use of abstraction above machine code.

                                                                                            • aforwardslash 3 months ago
                                                                                              Many modern compilers already do this - they generate intermediate code, and then a specialized binary is generated. Why couldn't an LLM generate intermediate code directly?
                                                                                            • skydhash 3 months ago
                                                                                              Abstractions are patterns that lead to reusable solutions, so you don't need to write the same code again and again where you can use a simple symbol or construct instead. They lead to easier understanding, yes, but also to re-usability.
                                                                                              • aforwardslash 3 months ago
                                                                                                Well, no. One thing is language abstractions, another is algorithmic abstractions; language abstractions exist for human convenience, algorithmic abstractions are... algorithms. Consider this: can you implement a hash table in your favourite language? If so, can you implement it in assembly for your favourite architecture? And if so, can you write it in binary for a limited architecture? If you stopped at the first yes, an LLM is already more advanced.
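As a reference point for the first rung of that ladder, a hash table in a high-level language is only a few lines (a toy sketch using separate chaining, not production code):

```python
class HashTable:
    """Toy hash table with separate chaining for collisions."""

    def __init__(self, capacity: int = 16):
        self.buckets = [[] for _ in range(capacity)]

    def _bucket(self, key):
        # Pick a bucket by hashing the key modulo the bucket count.
        return self.buckets[hash(key) % len(self.buckets)]

    def set(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite existing key
                return
        bucket.append((key, value))

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)
```

The assembly and raw-binary versions of the same structure are where most people stop; that gap is the point being made above.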
                                                                                              • techpineapple 3 months ago
                                                                                                This is an interesting observation, and it seems true for future versions of AI, but while the current technology is based on human language, I wouldn't assume that LLMs would translate directly to manipulating machine language.
                                                                                                • aforwardslash 3 months ago
                                                                                                  While it is true that language is an important input, there are plenty of examples of the transformer architecture generating binary data; a good example is generating a 3D mesh from a text prompt (there are a shit ton of models for this). The input is text, the output is a binary file: a JPEG, a PNG, or an STL. We're already there, just not for applications.

                                                                                                  FYI, you can do this today: ask for a dog with a flying cape and get an STL you can print into a physical object.

                                                                                                • Tteriffic 3 months ago
                                                                                                  True, it’s all machine language in the end. But could you imagine brute forcing UI elements and all, every single time? Maybe eventually.
                                                                                                  • 1shooner 3 months ago
                                                                                                    It's funny you mention UI, because for me this is the first place I'm experiencing this, in web development: LLMs are better at the 'brute force' nature of e.g. Tailwind (inline UI styling) vs. needing to handle the abstractions of indirectly-applicable CSS classes. That's a really small, high level example, but to me it demonstrated how less abstraction actually makes prediction easier for the LLM.
                                                                                                  • aforwardslash 3 months ago
                                                                                                    Short summary, this.
                                                                                                  • cma 3 months ago
                                                                                                    Alternatively, junior devs with ai web search and ai explanations are able to learn much faster and not bother senior devs with compiler error puzzles.
                                                                                                    • n_ary 3 months ago
                                                                                                      But if those aspiring juniors stop asking those questions and the seniors not answer them in open web, neither LLMs nor juniors have new training data and both become stagnated. How do we solve this?

                                                                                                      At my early days, I learned more reading code from much senior engineers and began to appreciate it. An effective seasoned senior writes a beautiful poetry that conveys deeper meaning and solves the entertainment(erm… business requirement) purpose. If the seniors retire and the juniors are no longer hired, then where do LLMs get new data from?

                                                                                                      In all sense of things, I suspect we’ll see more juniors getting hired in coming years and few seniors present to guide them, same as how we previously had few db specialists and architects giving out the outline and the followers made those into actual products.

                                                                                                      • aforwardslash 3 months ago
                                                                                                        The way forward I see is a path to autonomous development; human development becomes boutique dev or supervision. A good example is x86-64 assembly skills today. The bulk of dev will be handled by AI, especially the mindless CRUD stuff with browser integration. When you skip the whole intermediate language and/or can generate a high-level binary definition spec that can be compiled, it's game over. Development as we know it is dead, and people will target intermediate AI-solvable code, as they do today to satisfy various pipelines such as testing or cyclomatic complexity.
                                                                                                      • aforwardslash 3 months ago
                                                                                                        Why should a company pay X for a guy who will take 10 years to understand how apps fail, when it can just hire a guy at 2X or 3X? Especially if, in 3-5 years, you can skip the guys altogether?
                                                                                                        • hooverd 3 months ago
                                                                                                          > are able to

                                                                                                          But they won't. They'll just give up if the auto-complete doesn't fix it for them.
                                                                                                        • lurking_swe 3 months ago
                                                                                                          so what happens when the AI makes a mistake, how do you debug it or fix it if it’s not in an abstraction useful to a human?
                                                                                                        • ookblah 3 months ago
                                                                                                          Articles like this honestly confuse me. I really do not understand this sentiment some coders have, where it feels like every line they write is some finely chiseled piece of wood on a sculpture they made.

                                                                                                          Since day one I've always liked to build things much like the author, my first line of HTML to CSS, frameworks, backend, frontend, devops, what have you. All of it a learning experience to see something created out of nothing. The issue has always been my fingers don't move fast enough; I'm just one person. My mind can't sustain extended output that long.

                                                                                                          My experience with AI has been incredibly transforming. I can prototype new ideas and directions in literally minutes instead of writing or setting up boilerplate over and over. I can feed it garbage and have it give me ballpark insights. I can use it as a sounding board to draw some direction or try to see a diff angle I'm not seeing.

                                                                                                          Or maybe it's just the way some people use AI coding? Like it's some magic box, and if you use it you won't understand anything, or it's going to produce some gibberish? Like a form of bikeshedding where people hold the "way" they code as some kind of sacrosanct belief. I still review nearly every line of code, and if something is confusing I either figure it out and rewrite it, or just comment on what it's doing if it's not critical.

                                                                                                            • bencornia 3 months ago
                                                                                                              Whew! This pretty much sums up how I have been feeling about my career the last 6 months. Today, at work, we had a tongue-in-cheek "vibe coding" day where we practiced building primarily with AI tools. A part of me feels like this isn't what I signed up for when I became a software engineer. I am a builder, not a manager. Yet I was flabbergasted at how much I was able to build with Claude. But building doesn't exactly describe what I did. As the article suggests, it does feel like managing and not building. I didn't read every line of code that was generated. I think describing the current zeitgeist as an identity crisis is spot on. Software engineering is going through a fundamental shift. But that has always been the case; the field has drastically evolved since its inception. The only difference is that the rate of change has increased such that it feels like we are experiencing a titanic shift. It certainly is an exciting time!
                                                                                                              • cadamsdotcom 3 months ago
                                                                                                                It’s great that the typing part is being reduced - as is looking up APIs and debugging stupid issues that wreck your estimates by wasting your work-day!

                                                                                                                You are still in charge and you still need to read the code, understand it, make sure it’s factored properly, make sure there’s nothing extraneous..

                                                                                                                But ultimately it’s when you demo the thing you built (with or without AI help) and when a real human gets it in their hands, that the real reward begins.

                                                                                                                In the future that’s coming, non-engineers will be more and more able to make their own software and iterate it based on their own domain expertise, no formally educated software engineers in sight. This will be outrage fuel for old-mindset software engineers to take to their blogs and shake their metaphorical walking-sticks at the young upstarts. Meanwhile those who move with the times will find joy in helping non-engineer domain experts to get started, and find joy once again in helping their non-engineer compatriots solve the tricky stuff.

                                                                                                                Mark my words, move with the times people. It’s happening with or without you.

                                                                                                                • kylehotchkiss 3 months ago
                                                                                                                  Anecdotally I perceive there’s been less open source JS libraries/frameworks being released lately. I generally keep an eye on JS/node weekly newsletters and nothing has seemed interesting to me lately.

                                                                                                                  Of course that could be my info bubble (Bluesky instead of twitter, newsletters, slightly less attracted to shiny than I used to be)

                                                                                                                  Anybody else feel the same?

                                                                                                                  • georgemcbay 3 months ago
                                                                                                                    My hobby project languages of choice these days are Kotlin multiplatform and Go, not JS, but there are multiple things I've worked on over the past year that I would have open sourced in the past but won't now because I'm not interested in freely helping with the big LLM slurp.

                                                                                                                    I'm not ideologically against LLMs as a technology or the usage of them, but I do believe there is an inherent unfairness in the way a small set of companies have freely hoovered up all of this work meant to enhance the public commons (often using the most toxic and anti-social web crawling robots imaginable) while handwaving away what I believe are important copyright considerations.

                                                                                                                    I'd rather just not release my own source code anymore than help them continue to do that.

                                                                                                                    • aforwardslash 3 months ago
                                                                                                                      It's funny you assume source code is needed to infer the behaviour of a given application, or to be a valid training item for an AI.
                                                                                                                    • fizx 3 months ago
                                                                                                                      AI is killing e.g. React Server Components and Svelte, but for different reasons.

                                                                                                                      Vibe coding doesn't care about the implementation details, so Svelte is dead.

                                                                                                                      A unified codebase might be better for humans. But a FE/BE split is better for AI, because you have a clear security boundary, separation of concerns, and well-known patterns for each individually.

                                                                                                                      • ldjkfkdsjnv 3 months ago
                                                                                                                        Disagree. Unified frameworks are better for AI, you can cohesively build a feature with the single output of the AI and it has an easier time integrating the two.
                                                                                                                        • fizx 3 months ago
                                                                                                                          Hypothetically, yes. Practically, after having built ~10 small apps in a unified framework with Cursor and Claude Code, it doesn't seem true with today's AI.
                                                                                                                    • porridgeraisin 3 months ago
                                                                                                                      It's a useful assistant. Never again do I need to fully manually argparse each flag onto a class Config:. I also found it useful in catching subtle bugs in my (basic, learning-purpose) CUDA kernels. It is also nice to be able to do `def utc_to_ist(x: str) -> str`<TAB>.

                                                                                                                      As for whole apps, I never agree with its code style... ever. It also misses some human context. For example, sometimes, we avoid changing code in certain ways in certain modules so as to get it more easily reviewed by the CODEOWNER of that part of the codebase. I find it easier to just write the code "the way they would prefer" myself rather than explain it in a prompt to the LLM losslessly.

                                                                                                                      The best part of it is getting started. It quickly gives me a boilerplate to start something. Useful for procrastinators like me.
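The `utc_to_ist` tab-completion mentioned above is a good example of the boilerplate these tools handle well. A minimal sketch of what such a helper might look like (the `"YYYY-MM-DD HH:MM:SS"` input/output format is an assumption; the comment only gives the signature):

```python
from datetime import datetime, timedelta, timezone

# IST (Indian Standard Time) is a fixed UTC+05:30 offset, with no DST.
IST = timezone(timedelta(hours=5, minutes=30))

def utc_to_ist(x: str) -> str:
    # Parse the naive timestamp, mark it as UTC, then shift it to IST.
    # The "YYYY-MM-DD HH:MM:SS" format is assumed for illustration only.
    dt = datetime.strptime(x, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
    return dt.astimezone(IST).strftime("%Y-%m-%d %H:%M:%S")
```

For example, `utc_to_ist("2024-06-15 21:00:00")` returns `"2024-06-16 02:30:00"`, rolling over to the next day.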

                                                                                                                      • greenie_beans 3 months ago
                                                                                                                        ai took away the fun of coding but it's impossible not to use it now that i've opened pandora's box. fortunately i don't have the identity problem. you shouldn't base your identity on your job. my problem is more like, "this isn't as much fun to do anymore"
                                                                                                                        • JaDogg 3 months ago
                                                                                                                          What I do is turn off Copilot, do the design, get in the zone, and then enable Copilot. This way it does not slow me down; otherwise I am just waiting for it to do API calls. Even then, I turn it off when it creates completely wrong implementations.

                                                                                                                          The problem is it doesn't know the layer of utilities we have and just rewrites everything from scratch, which is too much duplication. I then have to delete it and type the correct code again.

                                                                                                                          One advantage I have seen is that it can definitely translate/simplify what my colleagues say, and fix typos or partial words, which is very useful when you are working with a lot of different people.

                                                                                                                          • zer8k 3 months ago
                                                                                                                            AI is the only reason I’ve been able to keep up with the constant death marches. Since the job market went to shit, employers are piling on as much work as they can, knowing they have indentured servants.

                                                                                                                            The code quality isn’t great but it’s a lot easier to have it write tests, and other code, and then go back and audit and clean.

                                                                                                                            Feels absolutely awful but whatever.

                                                                                                                            • MiiMe19 3 months ago
                                                                                                                              Over my dead body will I ever code with AI.
                                                                                                                              • kelseydh 3 months ago
                                                                                                                                - The dinosaur engineer exclaims as the AI asteroid strikes.
                                                                                                                                • chippiewill 3 months ago
                                                                                                                                  I'm sure that's what the horse-drawn carriage drivers said about cars.
                                                                                                                                  • FirmwareBurner 3 months ago
                                                                                                                                    Thank you for your service!
                                                                                                                                    • meitham 3 months ago
                                                                                                                                      That’s the spirit!
                                                                                                                                    • antfarm 3 months ago
                                                                                                                                      I'd rather become a bike messenger and continue programming for fun than have LLM code generators take over the fun part of my job.
                                                                                                                                      • slowtrek 3 months ago
                                                                                                                                        We could always surrender the identity. Who ever said software engineering should be a profession that lasts hundreds of years?
                                                                                                                                        • 3 months ago
                                                                                                                                          • ldjkfkdsjnv 3 months ago
                                                                                                                                            I think we are underestimating the change that is about to occur in this field. There is a certain type of mind that is good at programming, and the field will no longer reward that type of mind as AI takes over. The smaller, gritty details of making something work will be smoothed over. Other types of person will be able to build, and might even surpass traditional "10x engineers", as new skill sets will take precedence.
                                                                                                                                            • JaDogg 3 months ago
                                                                                                                                              There is no such thing as a 10x engineer. Anyone who appears to be 10x only does whatever maintains that illusion (doesn't help anyone else, doesn't do support, keeps all the knowledge in their head, writes bad documentation, etc.).
                                                                                                                                              • soulofmischief 3 months ago
                                                                                                                                                But many would list the things you've just listed as table stakes for any 10x engineer.
                                                                                                                                              • d_silin 3 months ago
                                                                                                                                                After a wave of AI-written slop floods the software supply chain, there will be even greater demand for 10x software engineers.
                                                                                                                                                • pjmlp 3 months ago
                                                                                                                                                  Offshoring adoption, and the low quality of related projects, has proven this is not the case.
                                                                                                                                                  • achierius 3 months ago
                                                                                                                                                    How so? The number of American software jobs is still way up from when people said software was "dead" thanks to offshoring. By something like 50x!
                                                                                                                                                  • ldjkfkdsjnv 3 months ago
                                                                                                                                                    This is cope; fixing AI slop will just be what software engineering becomes. It's the new requirement of the job, not a failure state.
                                                                                                                                                    • 9dev 3 months ago
                                                                                                                                                      How are you going to tell spam from ham if you don’t understand the underlying systems and their constraints? And how are you going to gain that understanding if software engineering doesn’t value that anymore, and won’t educate people to gain it?

                                                                                                                                                      I dunno, it just doesn’t seem, like, all that thought out to me.

                                                                                                                                                      • hooverd 3 months ago
                                                                                                                                                        Eh, it's a great tool, but I'm still interested in understanding the world rather than proudly being incurious of it.
                                                                                                                                                  • twistedcheeslet 3 months ago
                                                                                                                                                    This is an excellent article.

                                                                                                                                                    We’re all just swimming as the AI wave comes crashing on every developer out there - we can keep swimming, dive or surf. Picking a strategy is necessary but it would probably be good to be able to do all of the above.

                                                                                                                                                    • n_ary 3 months ago
                                                                                                                                                      The article appears to be a rant and panic piece…

                                                                                                                                                      At this point, I am totally confused. When I attend expensive courses from Google or Amazon, the idea in the courses is that tech has become sooo complex (I agree; look at the number of ways you can achieve something using AWS's infinite number of services) that we need code assistants which can quickly remind us of that one bit of syntax, fill out the same boilerplate for the 10,000th time, or quickly suggest a new library function that would otherwise take several Google searches and 5h of wading through bad documentation, SEO spam, or another 50 StackOverflow questions with the same issue closed as not-focused-enough/duplicate/opinionated.

                                                                                                                                                      It is like they want to sell you this new shiny tool. If anyone here remembers the early days of JetBrains IDEs, the fans would whir and the IDE would freeze in the middle of an IntelliSense suggestion, but now they are buttery smooth and I actually feel sad when I'm unable to access them.

                                                                                                                                                      Now, on the outside, in news, media, blogs and what not, the marketing piece is being boosted 1000x with all the panic and horror, because certain greedy people found that the only way to dissuade brilliant people from the field, and to keep them from bootstrapping the next disrupters, is to signal that they themselves will be obsolete.

                                                                                                                                                      Come to think of it, it is all cheap now. The first idea was to hire them when investment was cheap and disruption risk was high. Then came the extinction of ZIRP, when it was safe to stop hoarding them, since no investment meant less risk of disrupters; but if some dared, acquire them and kill them in the crib. Then came the bad economy, so now it is easier to lay them off and smear their reputation so they can't get the time of day from deep pockets. The final effort is to threaten the field with a fake marketing and media campaign about them being replaced.

                                                                                                                                                      This panic drama needs to stop. First we had SysAdmins maintaining on-prem hardware and infra, but AWS/GCP/Azure/Oracle came along to replace them, only to move them up the chain, and now we need dedicated IAM specialists, certified AWS architects, certified Azure cloud consultants, and what not.

                                                                                                                                                      Sorry for the incoherent rant, but these panicked, envious “f*k you entitled avocado-toast-eating school-dropout losers, now you'll be so screwed” social media posts are so insane and get so weird that I am just baffled by the whole thing.

                                                                                                                                                      I don’t know what to believe: big tech telling me in their pretty courses and talks how my productivity and code quality will now be improved, or the media and influencers telling me we are going to be obsolete (and avocado-toast-eating dropouts, which I am not). Only time will tell.

                                                                                                                                                      In the meantime, the more demos I see of impressive LLMs building entire sites for everyone and their pet hamster, the more frontend engineering jobs pop up daily in my inbox (I keep job alerts to watch market trends and dabble in topics that might interest me).
