At DrupalCon Vienna 2025, Dries Buytaert presented a big new feature: Drupal Canvas. It is a visual page builder in Drupal that allows users to drag and drop to design layouts without touching templates or code. At Lemberg Solutions, we wanted to try it out and show its capabilities to clients as quickly as possible. Instead of setting it up locally, we decided to deploy it live on Upsun (formerly Platform.sh), so anyone could open it in a browser and explore it firsthand. In this article, we will take you through a step-by-step process, highlight the issues we faced, share the lessons we learned, and reveal how long this task actually took.

What our journey looked like: A step-by-step process

Step 1: Write a detailed prompt for AI code generation

We started in the Cursor IDE with a simple prompt for AI, asking it to act as a senior Drupal and DevOps engineer. We provided all the details that the AI assistant needed to prepare a demo environment. Below, you can see our prompt:
You are a senior Drupal + DevOps engineer helping me prepare a client-facing demo environment for Drupal Canvas on Upsun (formerly Platform.sh).

Drupal Canvas was presented as a major new visual page-building feature in Dries Buytaert’s “State of Drupal” keynote at DrupalCon Vienna 2025.

There is an official demo project here:

        Blog post: https://dri.es/state-of-drupal-presentation-october-2025
        Demo repo: https://github.com/phenaproxima/canvas-demo

I want to adapt this demo to Upsun so that I can spin up short-lived demo environments for clients (preview links), not build a real production site.
AI prompt in Cursor IDE

Problem #1: The wrong package

The first issue we faced was that our AI assistant picked up outdated information and used the wrong Composer package name: it installed drupal/experience_builder instead of drupal/canvas. This happened because during DrupalCon, the module was still referred to as “Experience Builder” in some places, which confused the tool. Using the wrong package can lead to serious compatibility issues later. Lesson learned: Always double-check Composer package names on drupal.org before hitting “Deploy.” AI knowledge isn’t always up to date, so it can rely on stale data. A quick manual check can save hours of debugging.
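In Composer terms, the fix is a one-line change in composer.json. The version constraint below is illustrative, not taken from the demo repo — check the Drupal Canvas project page on drupal.org for the currently supported release:

```json
{
    "require": {
        "drupal/canvas": "^1.0"
    }
}
```

Swapping the package also means removing the stale one first (composer remove drupal/experience_builder), so the two modules never coexist in the same codebase.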

Step 2: Check project structure

After the AI created the Drupal structure, ready for Upsun deployment, we spent some time making sure the project contained all the required files. Our final file and folder structure looked like this:

Final project structure

Problem #2: Excessive output from AI

Our AI assistant wanted to be helpful but went overboard: along the way, it generated 18 extra files. For a quick demo, all we needed was a single README, but it created eight documentation files, multiple setup scripts, and even a “Quickstart Guide” for tools we never planned to use. Lesson learned: When prompting AI, be crystal clear about your intent and about what the tool should deliver. Don’t say “set up Drupal Canvas on Upsun”; instead, say “set up Drupal Canvas on Upsun with DDEV, one README, minimal structure.”

Redundant files generated by AI that were deleted in the end

Step 3: Check platform configurations

Our project configuration originally lived in .upsun/config.yaml. It defined a single PHP application with a MariaDB service and set web/ as the document root. That’s where the next issue appeared.

Problem #3: Platform.sh vs Upsun: name confusion

Since Platform.sh recently rebranded to Upsun, the AI tool was confused and treated them as two separate platforms. As a result, it created two setups: Upsun files in .upsun/config.yaml and Platform.sh files in .platform.app.yaml, along with the entire .platform/ directory. To avoid confusion and duplication, we manually deleted one set of files. Lesson learned: Check your git remote and ensure that your project configuration matches the platform you are actually deploying to. Rebrands might confuse AI.
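Our cleanup amounted to keeping one config set and deleting the other. The sketch below replays that decision in an isolated temp directory — the paths are the ones from our repo, but nothing here touches git or Upsun:

```shell
set -eu

# Stand-in project directory with both config sets the AI generated.
demo=$(mktemp -d)
mkdir -p "$demo/.upsun" "$demo/.platform"
touch "$demo/.upsun/config.yaml"       # Upsun-style config (kept)
touch "$demo/.platform.app.yaml"       # Platform.sh-style config (removed)
touch "$demo/.platform/routes.yaml"

# Keep the Upsun set, delete the redundant Platform.sh set.
rm -rf "$demo/.platform" "$demo/.platform.app.yaml"

ls -A "$demo"    # only .upsun remains
```

Before deleting anything in a real repository, running git remote -v first confirms which platform the project actually pushes to.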

Step 4: Build and deploy hooks

We added simple hooks that allow the site to install itself automatically during deployment. Now, every new environment on Upsun builds a fresh Drupal Canvas instance, with no manual configuration setup required.
deploy: |
    set -eux

    # Check if the site is already installed
    if ! vendor/bin/drush status bootstrap 2>/dev/null | grep -q "Successful"; then
      echo "Installing Drupal site..."
      vendor/bin/drush site:install standard \
        --site-name="Drupal Canvas Demo" \
        --yes

      echo "Enabling Drupal Canvas module..."
      vendor/bin/drush pm:enable canvas --yes

      echo "Site installed successfully!"
    else
      echo "Site already installed, running updates..."
      vendor/bin/drush updatedb --yes
      vendor/bin/drush config:import --yes || true
      vendor/bin/drush cache:rebuild
    fi

    echo "Deployment complete!"
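For context, this deploy hook sits under the application’s hooks key in .upsun/config.yaml. The abbreviated sketch below shows roughly where it fits — the application name, PHP version, and relationship names are our own choices, not fixed values, so adapt them to your project:

```yaml
applications:
  drupal:
    type: "php:8.3"
    relationships:
      database: "db:mysql"
    web:
      locations:
        "/":
          root: "web"
    hooks:
      build: |
        set -eux
        composer install --no-dev --optimize-autoloader
      deploy: |
        set -eux
        # ... the install/update logic shown above ...
```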

Problem #4: Wrong Drupal profile

On the first deployment, we noticed that the AI had generated a nonexistent installation profile name: it tried to install the canvas_demo profile, which doesn’t exist. Lesson learned: After this failure, we switched the installation command to the standard profile and then explicitly enabled the canvas module inside the deploy hook. That small fix finally made the deployment fully automated and reliable.

Platform.sh build log showing “The profile canvas_demo does not exist” error
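A cheap guard against this class of error is to check that a profile actually ships with the codebase before calling site:install. The sketch below is our own illustration, not part of the demo repo; it simulates a checkout in a temp directory, relying on the fact that core profiles live under web/core/profiles in a Composer-based build:

```shell
set -eu

# Simulated project checkout containing only the core "standard" profile.
root=$(mktemp -d)
mkdir -p "$root/web/core/profiles/standard"

# Succeeds only if the named install profile exists in the checkout.
check_profile() {
  [ -d "$1/web/core/profiles/$2" ] || [ -d "$1/web/profiles/$2" ]
}

check_profile "$root" standard && echo "standard: found"
check_profile "$root" canvas_demo || echo "canvas_demo: does not exist"
```

Dropping a check like this into the deploy hook would have turned our cryptic build failure into an obvious message.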

The final result

After all the iterations and fixes, we ended up with a clean, lightweight Drupal Canvas demo, deployed on Upsun and running locally with DDEV. We didn’t get instant success, but after several iterations the demo finally works reliably, and it’s easy to reuse for client demos.

Final Drupal Canvas page on Upsun

What was the actual timeline for development? Here is a clear breakdown of the time we spent setting up the Drupal Canvas demo:
| Phase | Time | Key actions |
|---|---|---|
| Initial setup | 3 min | AI-generated base configs |
| First deploy errors | 10 min | Config + permission fixes |
| Docker troubleshooting | 15 min | Build failures, switched tools |
| DDEV migration | 7 min | Migration completed successfully |
| Wrong package fix | 20 min | Replaced experience_builder with canvas |
| Cleanup & simplification | 30 min | Deleted docs, folders, cleaned git, final review |
| **Total time** | **1 h 25 min** | From start to stable demo |
All in all, AI produced the initial configurations in about 20 minutes. However, roughly 70% of the work still fell to the human developers, who spent nearly an hour reviewing, fixing, and simplifying the code.

Key takeaways

Working with the AI assistant taught us several lessons that we want to share.

First, start small and keep it simple. The best way to work with an AI tool is to give it well-defined, simple tasks at the beginning and add complexity gradually over time. This keeps the AI more accurate and keeps developers from getting stuck on task details.

Second, provide not only context but also concrete examples. Even if you think you shared enough context at the start, you may need to write follow-up prompts with clear examples for better results. This helps the AI assistant understand your expectations and deliver results that align with your idea.

Third, human review matters. AI may write code fast, but as you can see above, most of the time it needs human oversight. People must still own the whole decision-making process.

For more insight into using AI for development, read our article on vibe coding, where we explore which companies can benefit the most from this approach, how to choose the right tools, and how to maintain security.
Last modified on April 14, 2026