Learning Programming in 2026
Intro
We are at the end of 2025, so it's time to make some plans for 2026. For many people, those plans depend on one question: is learning programming still a thing? I'm not working as a developer, and I'm not discussing the hiring trend, where lots of juniors have been dismissed and few are hired while senior developers remain in demand. I'm talking about the value of thinking and problem-solving, and about the ability to understand what's going on inside computer systems and make the right decisions.
Much like security rules are written in blood, history is rarely about having a visionary who heads in the right direction (although that was true of Unix and C++). More often it's about trying all the variants and learning "the hard way" how they fail, so let me bring a few real-life "Halloween" stories to your attention.
The wrong decisions
The first story happened at my first job. We were a team of three: a project manager, a developer, and me as a consultant. We were implementing the Microsoft Dynamics AX ERP system (now Microsoft Dynamics 365 Finance and Operations), so it was around 2006. The situation was not standard: we needed to inject monetary values into logistics transactions and into the financial side of the system. It turned out our project manager had a plan that was completely unfeasible, but he hadn't shared it with his team. The takeaway is not about teamwork, though; it's about never planning something unless at least someone knows how it works under the hood.
It's no surprise this keeps happening: decisions get made when no one actually knows what's going on. I've been in hundreds if not thousands of meetings, and the most frustrating ones are those where nobody understands the issue and, in their ignorance, nobody wants to invite a subject matter expert to clarify it, because, well, that would reveal their incompetence. This can lead to anything from an incident to a disaster, and that's what the next two cases are about. The idea common to both is the formula "incompetent person + language model = competent person," which doesn't work out at all:
- The Tea app, a social network for women, faced a massive personal data leak after boasting about extensive use of language-model-generated code.
- The AWS outage, noticed by almost everyone on the Internet: again, AWS had heavily promoted language-model-generated code before it happened. A software bug was cited as the cause, and I don't believe that in 2025 qualified programmers and testers are unable to push reliable code to production; language models, however, certainly are not able to do that.
The trend toward language-model-generated code is like a locomotive, impossible to stop right now, so we'll see more outages for sure. Another funny thing is that some developers now advertise themselves as specialists in rewriting AI-generated code, which is exhausting work, by the way: you face at least 2–4x more code than there should be, code that implements features no one asked for while omitting security and reliability... well, because no one asked for them either.
What is programming
I'd like to redefine programming, and my definition, regrettably, is not in line with a basic Computer Science (CS) course:
- It’s about learning a programming language, but you need to learn at least one language other than Python. Python is concise and simple, but it's so close to human language, and the interpreter is so friendly in reporting errors, that it feels almost like cheating. Without enough resistance you don’t get enough exercise, and exercise is what working with computers is about: developing a certain level of problem-solving ability. So I suggest learning something harder, like C++, Java, or JavaScript.
- It’s about learning markup languages as well: HTML, Markdown, LaTeX. The ability to see structure when you read markup is important, and these languages are the basis of modern IT. Learning front-end development also changed my perspective on user interfaces: the back-end defines what capabilities you have, but presentation matters, and you should at least have an opinion on front-end technologies like Angular, Vue, or React.
- It’s about learning SQL, which is included in CS courses, because the logic is often not about transforming data but about storing and extracting it, and that may not be obvious.
- It’s about learning the basics of how the Internet and networks in general work. What are the protocol layers? Why would you prefer the Internet over a local network? When computers start to interact with each other, security questions arise: what level of security are we able to achieve and promise to our customers?
- It’s about learning the basics of operating systems: how they function, what layers they have, and why Linux is better (just joking). Working in IT implies having some perspective and knowing some history: what about microkernel architecture, for example? And history, especially computer history, has a lot of lessons to teach about why something "ugly" works in the long term while some architecturally beautiful or overhyped concepts fail.
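To make the SQL point above concrete, here is a minimal sketch using Python's built-in sqlite3 module (the `orders` table is a made-up example): the "logic" is not a loop that transforms data, but a declarative query over data that is stored and then extracted.

```python
import sqlite3

# An in-memory database; `orders` is a hypothetical example table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("alice", 120.0), ("bob", 80.0), ("alice", 40.0)],
)

# The business logic is one declarative statement, not imperative code:
rows = conn.execute(
    "SELECT customer, SUM(total) FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('alice', 160.0), ('bob', 80.0)]
```

The same aggregation written as a Python loop would be longer, and the database would not be able to optimize it; that is exactly the "storing and extracting" mindset the bullet describes.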
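As for the networking bullet: even a connection to your own machine walks the protocol layers. A minimal sketch with Python's standard socket module, echoing bytes over TCP on the loopback interface (port 0 asks the OS for any free port, so nothing here is a real service):

```python
import socket
import threading

# Application data (our bytes) rides on TCP (reliable stream), which
# rides on IP (addressing) -- the same layering applies even on loopback.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def echo_once():
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))  # echo back whatever arrives

t = threading.Thread(target=echo_once)
t.start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"ping")
reply = client.recv(1024)
client.close()
t.join()
server.close()
print(reply)  # b'ping'
```

The security questions from the bullet start exactly here: this exchange is plaintext, so anything crossing a real network this way can be read and modified in transit.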
And even that’s not enough: there are programming patterns, architectures, and specific security vulnerabilities, which means the features visible to the user are only about 10% of the iceberg. So yes, programming is hard.
What about AI
Sorry for repeating myself, but I'll write it again just in case you don’t know: we don’t have AI yet. Just listen to the podcast with Richard Sutton: language models use knowledge, but they are unable to learn, even the way animals do. Yes, they can pass the Turing test, but that’s still an emulation of intellect rather than the real thing. True intellect is called AGI (artificial general intelligence), and forecasts vary: one of the latest episodes of the same podcast, with Andrej Karpathy, is called "AGI is still a decade away." I’m not sure I’m going to listen to it; it sounds pessimistic. Or optimistic, depending on how you look at the path behind us.
What professions will disappear
I'd recommend listening to the podcast with Elon Musk, although it’s boring at times. Elon states that all professions working with computers will disappear. I’d say that working with requirements is the most difficult part, even for humans. In the real world, product managers and project managers do this job. Being one of them, I can see the demand falling. Can developers do this job? Yes, sure. The great devs I know were great at working with requirements (and architecture). The work model just tends to be more primitive now: creating a prototype at each step and moving forward based on feedback. It’s not that bad – if "the managers" are so expensive, this model can even be more cost-effective.
A side history quest
The history of IT is rather short, but I'd like to ask a question: when did humanity make progress in general? I believe it happened when highly talented and motivated people (self-)assembled into teams and faced challenging tasks. For me, the case of "Windows vs. Linux" is clear and closed, and that’s exactly what happened there (in my humble opinion). In the early days of Linux, FreeBSD was also a thing, but for me the turning point from FreeBSD to Linux was ReiserFS, a journaling filesystem that boosted filesystem performance on Linux. Windows filesystem performance looked just ridiculous in comparison, and from a performance viewpoint Windows made no sense because of its outdated multitasking model. So in this case, engineers (or developers) won over "managers," whether they were managing products, projects, finances, or marketing.
What’s happening now looks like "the managers" striking back, using language models to get rid of devs. Look at Windows now: it’s clearly a follower, copying Linux tech, be it WSL (Windows Subsystem for Linux) or PowerShell with its CLI approach (an approach that existed before Linux, I must say). And to put the final nail in Windows’ coffin, they even introduced ads into Windows, so even more Windows supporters are pissed off.
It’s not "devs vs. managers" or "devs vs. language models." It’s "true vs. fake."
What’s intelligence
I think intelligence reveals itself in solving problems, and there’s no other way to train intelligence than by solving problems, always raising the bar. Programming is great for training intelligence because, unlike real life, it doesn’t let you cut corners: in real life, people can abandon their own requirements if you’re convincing enough, you can motivate people to work overtime to compensate for bad planning, and so on. IT is ideal in terms of strict requirements: you can’t cut corners there; it either works or it doesn’t. It also teaches the very important concept of "good enough," which I think is underappreciated in real life.
On the other hand, if you are suffering while learning, you’re probably on the right path, because it means you have a challenging task. By the way, choosing a "good enough" solution can also be quite painful for us perfectionists. :) That’s the reason people choose to write programs in a language literally called "Brainf*ck," or on a Raspberry Pi, a computer with very limited resources. You raise the bar not by making the task more complex, but by limiting the resources or the toolset. And it seems to work.
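To illustrate the Brainf*ck point: the whole language has eight commands, which is why writing programs in it is such good exercise, and also why an interpreter for it fits in a few dozen lines. A minimal sketch in Python (no error handling, the input command `,` omitted, brackets assumed balanced):

```python
def bf(program: str) -> str:
    """Interpret a Brainf*ck program and return its output as a string."""
    # Pre-compute matching bracket positions for the two loop commands.
    jumps, stack = {}, []
    for i, c in enumerate(program):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i

    tape, ptr, pc, out = [0] * 30000, 0, 0, []
    while pc < len(program):
        c = program[pc]
        if c == ">": ptr += 1                       # move pointer right
        elif c == "<": ptr -= 1                     # move pointer left
        elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256  # increment cell
        elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256  # decrement cell
        elif c == ".": out.append(chr(tape[ptr]))   # output current cell
        elif c == "[" and tape[ptr] == 0: pc = jumps[pc]  # skip loop
        elif c == "]" and tape[ptr] != 0: pc = jumps[pc]  # repeat loop
        pc += 1
    return "".join(out)

# 8 * 8 + 1 = 65, the ASCII code for "A"
print(bf("++++++++[>++++++++<-]>+."))  # A
```

Notice how much work even "print one letter" takes: the resources and toolset are deliberately limited, which is exactly the kind of bar-raising the paragraph above describes.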
What are the challenges for programming
Okay, what if you’re trying to create an app and you’re not taking the fast, easy, and wrong path of generating code with a language model? Let’s start with the requirements:
- You need to deliver fast because you still compete with language models :)
- You need to deliver cheap for the same reason – and I’m not talking about outsourcing here.
- Your code needs to be stable and secure.
I think choosing the right framework is the right place to start. It shouldn’t be too fresh and undocumented, but it should leverage recent advancements to... well, write less code.
Let’s take an example and talk about a web app. Django, Ruby on Rails, and Laravel gained popularity for a reason: as DHH rightly argued, they solve the problem "end to end." In the JS world it’s a little more chaotic and DIY-like; I’ve honestly lost track of what’s going on there. :) It’s a bad idea to start learning programming with a framework, but it’s so nice when you already know the one you need. And I’m really inspired by how wide the choice is now, especially compared to just a few years ago. Anyway, code generators within frameworks, used when you know what you’re doing, are far better than copying code from a chatbot. Please, for the sake of humanity, don’t do that. :)
Conclusion
Something inspires me even more than the progress in hardware and software since 1999, when I started playing with FreeBSD (and everybody was terrified of the “year 2000 problem,” ha-ha): it’s how much there is to play with now, and how many resources there are to satisfy your curiosity. Codecademy doesn’t pay me, but it’s hard not to mention their free resources for learning to code in almost any language, including Rust, Kotlin, and Bash. And this experience of feeling what’s on the edge of software development right now is quite satisfying.