This is my first article in a few years, and it comes with a major visual overhaul of my whole site. That felt like a good excuse to tell the backstory: how a spontaneous decision over one weekend led me to revive the blog in a totally new look – and to confront a key question: does using AI to write code make you less of a developer?

The site, for years
Let me start with a bit of history. By my count, this year marks exactly 20 years since I first put a website out into the world. I initially built it in Word (yes, really!), then moved on to FrontPage. WYSIWYG – What You See Is What You Get – was the umbrella term for pretty much all the visual editors used to build simple websites back then. There were quite a few of them, all promising massive shortcuts, yet even with their help, building websites was never entirely trivial. One challenge stands out, at least from today’s perspective: making things work in Internet Explorer (the Chrome-style monopolist of its era, except it didn’t even respect the web standards of its day).
Some time later I got curious about content management systems (CMS), including one of the more popular options of that era: PHP-Fusion. It was a small technical miracle, but free hosting limits made editing code impossible. Annoyed by those limits, I got serious about programming. I wanted a dynamically managed site that I’d build from scratch.
Armed with a little PHP programming booklet from Computer Bild, I launched a new site in October 2008. Over the following months and years, I kept tweaking it – and diligently documented every change in short articles. The desire to add new features drove my continued learning, and the results simply brought me joy.
For nearly 18 years, the site ran largely on the core I had written at the very beginning. Over time, I implemented user accounts, a comment system, polls, social media integrations, visitor analytics, and even a custom admin panel. The last major overhaul, in 2015, brought a solid redesign along with mobile support.
Status quo analysis
Since 2018, I hadn’t published a single new article. Why? Among the many reasons, here are the most technical ones:
- the admin panel, while it did make adding articles easier, was far from ideal – it lacked proper image management, editing, and uploading;
- article categories were too rigid, deeply baked into the site’s structure, and not easy to change;
- the homepage felt more like a portal than a personal site;
- there was no straightforward way to make the blog multilingual.
“AdminPanel” served me for years as a quick way to manage site content
Weighing the pros and cons of continuing to maintain 18-year-old legacy code, I started by diving back into what was actually in there. Well – it was painful to read, but that’s usually a good sign. Anyone (not just developers, though it’s especially true in this field) who revisits their own work years later, with fresh eyes and no emotional attachment, knows the feeling.
The conclusion was obvious: the only real value was the historical archive of articles, scattered across many files and structures. Decision: keep the archive, change everything else.
New requirements
Knowing I’d need an entirely new content system, I began researching what was already out there – what proven, existing solutions I could build upon. For that kind of groundwork I lean on AI chat tools. To be specific: at least two of them. My first pick is usually Claude, the second – Gemini. Just in case, for cross-checking the answers I get.
I made my prompt fairly specific, starting from expectations – that the whole thing should above all be easy to maintain – down to technical constraints, such as the hosting environment. I use shared hosting, because it’s one of the cheaper options, there’s plenty of competition on the market, and frankly, for a simple personal site you don’t need much more. I reject the paradigm that values cleverness over usefulness. I write my prompts in a mix of Polish and English, leaning toward English for purely technical matters.
The initial research ruled out PHP and offered two frameworks instead: Hugo and Eleventy. Both well-supported by the community, simple to use, easy to modify, and most importantly: they store content in Markdown (which is great!).
It’s worth pausing for a moment on the decision to drop PHP. Why had I been building sites in this technology as a first choice? Momentum played a big role. Competing alternatives were either immature or nonexistent, PHP’s community support was substantial, and the ability to easily mix HTML with PHP gave me exactly what I needed: dynamically generated content, reusable page fragments (include()), and basic user interaction.
Those needs mattered less now. What I wanted now was a simple portfolio site with a blog, much less social functionality (no built-in comments, at least for now), and a path to add features later if needed.
I chose Eleventy for its straightforward templating and JavaScript-based site generation. Hugo, on the other hand, uses a Golang engine. I didn’t see any real advantage there, especially since migrating between the two wouldn’t be too painful at the current level of complexity. And picking a technology just because it sounds impressive is worse than making a bad choice.
Setup and deployment test
Choosing a framework is typically only half the battle. The other half is figuring out how the whole thing will actually build and deploy to a server. I want to see the whole picture (even if only in my mind’s eye) before anything gets implemented. Today we have DevOps tooling that simply didn’t exist a dozen years ago – or if it did, it was too expensive for everyday use.
Source code hosting – GitHub. Building the site files – GitHub Actions, which in its free tier provides more than enough resources for an individual’s needs. Pushing the site files to a remote FTP server – also GitHub Actions. This foundation provides stability and independence at the same time. If anything changes on GitHub’s end (pricing or the service itself), the site can still be built locally with a single command. The dependency on GitHub is therefore at an acceptable level.
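To give a flavor of how little is involved, here is a minimal sketch of such a workflow. This is not my actual configuration – the action versions, output directory, and secret names are assumptions for illustration (the FTP step uses the community SamKirkland/FTP-Deploy-Action):

```yaml
name: Build and deploy

on:
  push:
    branches: [main]

jobs:
  build-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      # Eleventy writes the generated site to _site/ by default
      - run: npx @11ty/eleventy
      - name: Deploy over FTP
        uses: SamKirkland/FTP-Deploy-Action@v4.3.5
        with:
          server: ${{ secrets.FTP_SERVER }}
          username: ${{ secrets.FTP_USERNAME }}
          password: ${{ secrets.FTP_PASSWORD }}
          local-dir: _site/
```

The same `npx @11ty/eleventy` command is what keeps the local fallback alive: drop the workflow and the build still runs on any machine with Node installed.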
Core functionality
The next step was the actual implementation of the new site. I created several agents in Claude Code, such as:
- UX/UI Designer,
- HTML/JavaScript/CSS Developer,
- Reviewer.
This lineup seemed sufficient from the start – and it proved to be exactly that.
The key to success is the ability to articulate your thoughts clearly and unambiguously.
The more precise the prompt, and the more the specification covers not just the main scenario but also the edge cases, the faster the results match what you actually expect. In my prompts, I focused first on solving the architecture and structure, and only then on everything else – namely the visual layer.
My requirements were a bit unusual. I wanted the homepage in one language (English), but the blog itself to be bilingual (mainly in Polish). And with backward compatibility – meaning not every article, originally written in Polish, would need an English translation. The reason is simple: I have no intention of automating the translation of my articles, because I want full control over the process.
This led me to a straightforward article structure on the Eleventy side. Each article is a separate directory inside the blog folder. Its name matters, as it becomes the Polish slug. For the sake of consistency, I decided to prefix each folder name with a datestamp (the absence of a timestamp is intentional – I don’t need to publish more than one article per day). Within each folder, one file is required: index.md, containing the Polish article, and optionally index_en.md for the English version. The folder may also contain additional files, such as images, which can (but don’t have to) be shared between the two language versions.
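The layout described above can be sketched like this (the article names are made up for illustration):

```
blog/
├── 2026-02-07-nowy-wyglad/
│   ├── index.md        ← Polish article (required)
│   ├── index_en.md     ← English version (optional)
│   └── cover.png       ← media, shareable between both language versions
└── 2008-10-12-start/
    └── index.md        ← Polish only, no translation
```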
Each Markdown file has its own front matter, as required by the template. And here, too, simplicity and avoiding redundancy are key. Do we need to define the Polish slug again, or is it enough to only specify it in the English version? Can values from the base (Polish) article serve as defaults? These are the kinds of questions worth asking to keep the solution as simple as possible.
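As a rough sketch of that defaulting idea – the field names here are my guesses, not the actual schema – the English file only overrides what genuinely differs:

```yaml
# index.md – the Polish article is the source of truth
---
title: "Nowy wygląd strony"
date: 2026-02-07
tags: [www, eleventy]
---

# index_en.md – defines only what differs; date and tags fall back to the Polish file
---
title: "A fresh look for the site"
---
```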
I didn’t want the URL to include a language interfix (like /pl/ or /en/) because… it clashed visually with my .pl domain. A small thing, but crucial when defining requirements.
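One way to get such interfix-free URLs in Eleventy is a computed permalink derived from the folder name. The sketch below is illustrative only – the `-en` suffix for English versions is my assumed scheme, not necessarily the one the site uses:

```javascript
// Derive a language-prefix-free URL from the dated folder name.
// In .eleventy.js this kind of function could feed eleventyComputed.permalink.
function permalinkFor(folderName, lang = "pl") {
  // strip the leading YYYY-MM-DD- datestamp to get the Polish slug
  const slug = folderName.replace(/^\d{4}-\d{2}-\d{2}-/, "");
  // no /pl/ or /en/ interfix; English versions get a suffix instead (assumed scheme)
  return lang === "en" ? `/blog/${slug}-en/` : `/blog/${slug}/`;
}

console.log(permalinkFor("2026-02-07-nowy-wyglad"));       // /blog/nowy-wyglad/
console.log(permalinkFor("2026-02-07-nowy-wyglad", "en")); // /blog/nowy-wyglad-en/
```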
To me this is what vibe engineering is about – making deliberate decisions and thinking ahead while carrying out a task, rather than blindly following the first solution that comes from a half-baked prompt. That latter approach is vibe coding, which may seem to deliver good results in the short run, but without an engineering framework it will inevitably lead to technical disaster in the long term.
Righting a disastrous architecture later would be painfully expensive in both time and money.
UI/UX Design
Once the technical layer was in place, I focused on visuals. My design brief was simple: default to dark mode with an option to switch to light, refresh the logo, and pick a color palette.
Here, another agent – the UI/UX Designer – produced an excellent starting point, thanks to precise requirements. And in a matter like this, technical imagination is essential. The result you can see on the site today is largely the product of the initial iterations; later changes were mostly polishing.
It’s worth mentioning that when designing a site this small, there’s usually no need to reach for prototyping tools like Figma. I think I would have spent far more time on prototyping than on simply pouring my thoughts into a detailed prompt.
Reviewing changes
As I mentioned, I also set up a reviewer agent for going over successive changes – both in the code and in the design. One good habit I developed was regularly checking the site’s correctness using this agent. This also introduces another responsibility: keeping the code clean and not leaving behind any unused leftovers that tend to accumulate across iterations.
Migrating old articles
The final step was migrating the old article archive, along with all media, to its new home. As I mentioned at the beginning, the original database was poorly designed, with data scattered across various locations. Additionally, the HTML content had been heavily shaped by legacy styles. It was messy, but manageable.
The key was to plan a script that would do the heavy lifting: merging the data and organizing it into the new file structure. To simplify the task, I cleaned up all the necessary files and created a new, somewhat simplified temporary structure specifically for the script. I think this kind of preparation is quite important.
I didn’t want to run all articles through LLMs just to get a working script. Instead, I picked examples, including edge cases, and provided the script author with a description of the source and target structures and the conversion rules from HTML to Markdown. The script did its job. The final step was verifying the process. Simple logging of source files and their target paths proved enough to spot orphans and missing assets.
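The verification step can be as simple as the sketch below – not the actual migration script, just the idea: pair every legacy file with its computed target path and report whatever fails to map:

```javascript
// Pair each legacy source file with its computed target path; anything that
// maps to nothing is an orphan to investigate by hand.
function verifyMigration(sourceFiles, targetFor) {
  const mapping = [];
  const orphans = [];
  for (const src of sourceFiles) {
    const target = targetFor(src); // null means "no home in the new structure"
    if (target) {
      mapping.push([src, target]);
    } else {
      orphans.push(src);
    }
  }
  return { mapping, orphans };
}

// Hypothetical usage with a toy mapping rule:
const result = verifyMigration(
  ["articles/1.html", "notes.tmp"],
  (src) => (src.endsWith(".html") ? src.replace(/\.html$/, "/index.md") : null)
);
console.log(result.orphans); // → [ 'notes.tmp' ]
```

A plain log of `mapping` plus a non-empty `orphans` list is exactly the kind of output that made missing assets easy to spot.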
So, is vibe engineering truly nothing to be ashamed of?
Everything I described above was entirely doable within a single weekend, a few hours of work per day. From a spontaneous idea to full execution and complete migration.
This represents a fundamental shift in my approach to writing software. Repetitive tasks become faster and less tedious, but one thing hasn’t changed – for all of this to have real value, rather than being just a hollow shell of software, you need oversight and a clear vision of what you want to achieve and how.
The pace of change in artificial intelligence is accelerating, yet the actual adoption and daily use of this technology remain surprisingly low.
Global AI adoption as of February 2026 (source: X/@damianplayer)
Surprisingly low for someone who works with this tech practically every day.
So, to keep this article from aging poorly, it’s worth noting in closing that it’s based on my experiences from early 2026.
Is vibe engineering, then – following the question posed in the title – something to be ashamed of for people like me? I don’t think so. For a simple reason: this technique, like many others, helps you achieve specific goals more effectively – even something as ordinary as redesigning and rewriting my website. What matters, though, is that an engineering mindset is still required: defining the requirements, specifying them clearly, and validating the results even more carefully than before.