Ask HN: Are you worried your non-AI project might soon be obsolete?

16 points by tablatom 2 years ago | 22 comments
I'm not so worried that AI will become so good at coding that I'll be out of a job. It's not clear that's going to happen any time soon, and if it does, the implications for all sorts of jobs are so vast that it would be a totally different world.

But there's a different concern: that the era of millions of apps might be coming to an end, and with it the viability of many of us freelancers and small software companies. Instead we'll have a small number of incredibly capable and malleable platforms.

And it doesn't have to get all the way to that endgame for the app I work on to become redundant.

  • muzani 2 years ago
    I made https://random-character.com

    Yeah, it's the perfect use case for generative AI. I worked on it because it seemed like a hard and niche enough problem, and yet not important enough that someone else would try to solve it.

    It follows certain patterns, aka tropes. Story tropes have been around since Classical Greek civilization. It uses procedural generation to make sure that characterizations don't conflict, and some techniques to make it easy to get into the character rather than just reading a third-party description.
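
    To give a rough idea of the conflict-avoidance part, here's a toy sketch in Python (not the actual site code; the trait data here is made up purely for illustration):

      import random

      # Toy trait pools with explicit conflict sets; the real trope data is much
      # richer, this just shows the "no conflicting characterizations" idea.
      TRAITS = {
          "brave":     {"conflicts": {"cowardly"}},
          "cowardly":  {"conflicts": {"brave"}},
          "honest":    {"conflicts": {"deceitful"}},
          "deceitful": {"conflicts": {"honest"}},
      }

      def generate_character(n_traits=2):
          chosen = []
          pool = list(TRAITS)
          random.shuffle(pool)
          for trait in pool:
              if len(chosen) == n_traits:
                  break
              # keep a trait only if it doesn't conflict with anything already picked
              if not any(trait in TRAITS[c]["conflicts"] for c in chosen):
                  chosen.append(trait)
          return chosen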

    As you can see, it's far from perfect. Search traffic has gone down about 50%, but direct users have gone up by about 30%. So it does appeal to a niche.

    I've been using GPT-3 as a kind of "pre-render" because it's so damn tiring to write descriptions of someone being brave. But it's not quite good enough, and ChatGPT can just do the whole thing quickly and almost for free.
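
    The "pre-render" flow is roughly this - a simplified sketch using the old Completions API, not the production code:

      import openai

      openai.api_key = "sk-..."  # your own key

      CACHE = {}  # trait -> pre-rendered description, generated offline and stored

      def prerender_trait(trait):
          # Batch-generate descriptions ahead of time, so the free app never
          # has to hit the API at runtime.
          if trait not in CACHE:
              resp = openai.Completion.create(
                  model="text-davinci-003",
                  prompt=f"In two sentences, describe a character who is {trait}.",
                  max_tokens=80,
              )
              CACHE[trait] = resp.choices[0].text.strip()
          return CACHE[trait]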

    ChatGPT has a huge weakness here though - it doesn't quite know what makes an interesting character, and usually the people asking it don't know either. It's about setting up tension. There's intention and obstacle. Plots and characters usually need to be simplified to make the story easier to follow.

    With GPT-3.5 being so cheap, we could actually plug AI directly into a free app like this. But at 5 cents per generated character, it still adds up to about $100/month.
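
    To make the numbers concrete (back-of-the-envelope; the monthly volume is just the figure implied by $100 at 5 cents each):

      cost_per_character = 0.05       # ~5 cents of API spend per generated character
      characters_per_month = 2000     # volume implied by the ~$100/month figure
      print(cost_per_character * characters_per_month)  # 100.0 -> ~$100/month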

    Then again, there's interesting stuff it opens up, like content that's directly relevant rather than generic. It can generate incredible detail on articles of clothing, and so on.

    • vmc_7645 2 years ago
      I'm even worried about my AI project, because these larger companies just have so much more in terms of resources. Unless I find some novel methodology that's difficult to replicate without its explicit capability, my product is dead in the water.
      • james-revisoai 2 years ago
        Same boat - I do fear this. Like you, by the sound of it, I've worked on an app that was data-intensive and used older machine learning techniques (from 2018-2021... BERT, T5 models, OCR - all since outdated), and our main existing competitors being able to deploy GPT cheaply has undoubtedly hurt our business.

        It also decreases, in my opinion, the likelihood and value of acquisitions of startups, at least those easily substituted with an LLM.

        I think there are strategies or tangents where LLMs aren't as effective, but these are harder to demo or convince investors of. It's a tricky time.

      • polalavik 2 years ago
        Nope. I've played around a lot with generating code for MATLAB, Python, and signal processing. It gets things wrong 90% of the time, but gets you maybe 30-50% of the way to a solution. I'm no doomer about AI, because writing good software that makes sense and works will stay a human task for a long while.

        Gluing multiple systems together for a full project - whether it be web, CI/CD/devops-style problems, or large systems engineering problems - will surely almost never be replaced by AI. The larger the scale, the more places there are to be wrong, and the more BS there is to debug if you get AI to generate it, because it will have a million issues and nobody will know where to start looking.

        • admissionsguy 2 years ago
          Yes, I am worried about it. However, so far the impact has been positive, with GPT enabling me to expand the scope of what my project can achieve with the same amount of work.

          I feel a sense of urgency to generate wealth before AIs become capable enough that those who didn’t make it become stuck in some sort of UBI limbo.

          On the other hand, when I try to apply GPT-3.5 and 4 to useful work, their limitations become so glaring that I think most takes on the internet in the vein of "ChatGPT made an app" are incredibly naive.

          • 462436347 2 years ago
            > I feel a sense of urgency to generate wealth before AIs become capable enough that those who didn’t make it become stuck in some sort of UBI limbo.

            The sense of urgency you describe is exactly how I feel.

          • akasakahakada 2 years ago
            GPT-4 still sucks at scientific programming, where you have to invent every wheel from scratch. It's fine if you consider generic, unoptimized code productive.

            Maybe all we need is to finetune an LLM for code optimization. But first you need a large enough codebase handcrafted by PhDs - and then again, not many PhDs are even qualified to write performant code.

            • mdp2021 2 years ago
              So you mean: "non-AI /aided/ project", i.e. using some algorithmic (or meta-algorithmic) writer to output drafts of the code.

              There will be more competition, which means both more accessibility and more bad quality.

              On your side, you may have some aid to be more productive - /may/. The projects you are working on can become "redundant" only in the sense of "becoming part of a number of other spawning projects doing similar things, owing to new global levels of accessibility, and competing with a lot of noise, owing to new global levels of bad-quality output".

              • tablatom 2 years ago
                > So you mean: "non-AI /aided/ project", i.e. using some algorithmic (or meta-algorithmic) writer to output drafts of the code.

                No, I meant apps that don't use deep learning, LLMs, etc. internally.

                • mdp2021 2 years ago
                  It makes no sense to generalize: it depends on the product, really. Prof. Patrick Winston used to say, ten years ago: "Machine Learning does what we do deterministically, but badly". Now, scale and new techniques have enabled ML to achieve amazing outputs in areas where we could not get similar results deterministically. That does not mean it should be used for everything. You are not supposed to use hammers instead of screwdrivers just because hammers have become fashionable.

                  You should probably clarify your concern.

              • chewz 2 years ago
                I don't...

                First, I write code for my own pleasure. I do it because I like putting my thoughts in this form...

                Second, AI will first eliminate derivative work, script kiddies, sweatshops, etc. I won't miss them.

                • mdp2021 2 years ago
                  > derivative work, script kiddies

                  I am not sure why or in which way you are saying that: I'd say that kind of activity will be boosted.

                  • chewz 2 years ago
                    > that kind of activity will be boosted

                    but there would be fewer human programmers behind it...

                    • mdp2021 2 years ago
                      > fewer human programmers

                      I would guess substantially the opposite: maybe fewer "real" programmers, probably more people. I wrote nearby, «There will be ... more accessibility and more bad quality».

                • SeanAnderson 2 years ago
                  Yeah, kinda. It's not even really that related, but I've been tinkering around with indie game development and I just can't shake the feeling that the quality of experience waiting to be delivered through emergent interactions with AI agents is going to make whatever I cobble together look trite in comparison. Time will tell, though.
                  • johlits 2 years ago
                    Went through a list of open APIs and I can see a lot of them becoming irrelevant in the future. So yeah, we should "AI-proof" our projects.
                    • anthonyhn 2 years ago
                        I work on a search engine in my spare time as a side project. What I've learned is that even though the recent GPT models are quite good for general-purpose search, there are still ample opportunities in search, and generally not enough people looking at it to cover everything.
                      • rainytuesday 2 years ago
                          I saw a podcaster say "3 of my app ideas are now obsolete". I had a halfway decent travel app idea that was made immediately obsolete. And it's not only apps -- what about extensions, like VSTO (Visual Studio Tools for Office)? You add GPT to MS Office and we won't need people programming extensions and add-ons.
                        • mattbgates 2 years ago
                          AI couldn't replace the projects I work on, but it definitely has been a great assistant, and I've even had it build websites and write code for me.
                          • richardjam73 2 years ago
                            Sometimes I feel like I live in an entirely different universe from other HN users and posts like this reinforce that view.
                            • mdmglr 2 years ago
                              Can you elaborate more?
                              • richardjam73 2 years ago
                                  It is mostly about people's motivations for programming and the complexity of their projects. Although I wouldn't mind making money off my projects, that's not why I start them. Also, my projects are often quite complicated and don't revolve around web tech that much.