
External High-Level View Of User Community Discussion About n8n

  • Writer: Harshal
  • 5 days ago
  • 5 min read

User Community Analysis Deep-Dive Example For PM Interviews

Here, I share a deep dive from a product memo I wrote for n8n a few months ago, when I interviewed with them. Fast-forward: I received a job offer and will join them in Q4 2025. Because this memo was written without input from n8n employees, the information may not be accurate, but I hope it gives you ideas on the approach.

In earlier posts, I shared a framework to write a case study or product memo when you interview for Product Management roles with companies.

I read, scraped, and analyzed user discussions on GitHub, Reddit, Discourse, Fiverr, and Discord.

These insights may no longer be relevant, as I researched and wrote them before joining n8n. My goal here is to show you a deep-dive example for your Product Management interviews.

Raw notes when reviewing different communities.


Platforms Reviewed

I spent time exploring what users were saying across GitHub, Reddit, Discord, Fiverr, and Discourse forums. My goal was not to do an exhaustive study but to go breadth-first, scanning different platforms to see what surfaced repeatedly.

GitHub Issues

I scraped the titles and brief summaries of the open and closed issues on GitHub, then used AI to categorize the sentiment and the types of failures users reported (a minimal sketch of the scraping step is below). I was initially pleased with my analysis, but later in the interview process I discovered that a significant amount of user feedback lives on the official n8n forum, which I did not have time to analyze.
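
This sketch shows roughly how the scraping step can work, assuming the public GitHub REST API and the n8n-io/n8n repository; the page limit, CSV output name, and 300-character summary cutoff are illustrative assumptions, not the exact script I used.

```python
import csv
import requests

# Pull issue titles and short summaries from the public GitHub REST API
# for the n8n repository. Unauthenticated requests are rate-limited,
# so add an Authorization header with a token for larger pulls.
REPO = "n8n-io/n8n"
URL = f"https://api.github.com/repos/{REPO}/issues"
HEADERS = {"Accept": "application/vnd.github+json"}


def fetch_issues(state="open", max_pages=5):
    """Yield (number, state, title, summary) tuples, skipping pull requests."""
    for page in range(1, max_pages + 1):
        resp = requests.get(
            URL,
            headers=HEADERS,
            params={"state": state, "per_page": 100, "page": page},
            timeout=30,
        )
        resp.raise_for_status()
        items = resp.json()
        if not items:
            break
        for item in items:
            if "pull_request" in item:  # this endpoint also returns PRs
                continue
            body = (item.get("body") or "").replace("\n", " ")
            yield item["number"], item["state"], item["title"], body[:300]


if __name__ == "__main__":
    with open("n8n_issues.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["number", "state", "title", "summary"])
        for state in ("open", "closed"):
            writer.writerows(fetch_issues(state=state))
```

The resulting CSV can then be fed to an LLM in batches with a prompt that groups issues into recurring failure themes, which is the spirit of the AI-assisted grouping behind the theme lists below.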

Themes From GitHub Open Issues (Still Affecting Users)

Summary: frequent tool execution failures, timeouts, schema mismatches, and memory gaps in AI agent workflows. Users often struggled with context not being preserved across steps.

  • Tool execution failures and timeouts: AI Agent nodes often fail silently or time out, especially on long-running operations. Affects usability of connected tools during complex workflows

  • Model compatibility breakages (especially Gemini): Failures when integrating newer LLMs like Gemini 2.0 and 2.5. Errors occur due to changes in provider-specific APIs or assumptions

  • Memory and state retention issues: AI Agent does not preserve memory across steps. Context passed between tool calls may be missing or outdated.

  • Broken input/output and schema mismatches: Mismatches between expected tool input and actual values. Results in unpredictable errors or workflow halts

  • Self-hosted user experience issues: Missing UI elements like “Ask AI”. Some tools or features behave differently in self-hosted or Docker setups

Themes From GitHub Closed Issues (Frequent Past Pain Points)

Summary: many related to broken dynamic inputs, schema errors in popular nodes (Google Sheets, Postgres, HTTP), incomplete outputs, and silent failures. Several reports highlighted self-hosted limitations, missing UI, and Docker-specific problems.

  • Broken dynamic inputs and variable handling: AI Agent reuses old inputs or fails to apply $fromAI dynamically. Creates confusion in multi-turn workflows

  • Schema errors and node compatibility problems: Tools like Postgres, Google Sheets, and HTTP nodes fail due to missing parameters. Schema mismatches prevent execution or require workarounds

  • Incomplete or empty outputs: Executions succeed technically but produce no visible result. Especially common in self-hosted environments where logs and UI diverge

  • Duplicate tool execution or failures in loops and subflows: AI Agents call tools multiple times or use stale inputs in iterative flows. Causes logic bugs in automation sequences

  • Broken agent dropdowns and setup flows after upgrades: Users report missing configuration options after version updates. Setup experience becomes inconsistent

  • Model-specific behavior mismatches: Incompatibilities with providers like Anthropic, OpenRouter, and Bedrock. Some require non-obvious configuration sequences to function

  • Self-hosted and Docker limitations: Issues like unsupported databases, memory failures, and missing UI. Errors surface only outside the cloud-hosted product

  • Tool recognition and silent failures: The AI Agent does not recognize or connect to attached tools. Tool invocation appears successful but yields no output.

  • Tool response formatting issues: JSON or binary outputs cause parse failures or formatting errors. Outputs are malformed or silently dropped.

  • Setup confusion and incomplete reports: Many issues were closed due to missing details, billing errors, or misconfigurations. Indicates a need for better error messages and user guidance.

Reddit

  • Strong interest in self-hosting: users run n8n on Hostinger, Hetzner, or GCP VMs instead of n8n's cloud hosting. I wonder whether those Virtual Private Server (VPS) providers are growing on the back of tools like n8n.

  • Many share workflow ideas for businesses, showing the platform's versatility and commercial use. Example: one Redditor made $800 from a single consulting project sparked by their tutorial video.

  • Frequent discussions about rate limiting and free-tier APIs. Users want better model management to stay within quotas.

  • Some posts emphasize that AI agents work well as proofs of concept (POCs) but struggle in production due to input errors, query-rate (QPS) limits, and missing logs. Technical users praise n8n as a fast POC tool while also wanting better debugging and monitoring.

Discord

  • Community members discuss how self-hosting on a VPS (e.g., Hetzner for ~$6/month) is cheaper than paying for n8n Cloud.

  • Huge community: at the time I checked, there were 6,000 online and 42,000 total members.

  • Discord had active chatter, but n8n later told users to move to the Discourse forum instead of real-time chat for long-term searchability and support.

Fiverr

  • I saw over 6,700 gigs related to n8n automation. Many freelancers offer workflow setup, integrations, and troubleshooting.

  • Demonstrates a micro-economy around n8n expertise, where community knowledge directly converts into paid services.

Discourse Forum

  • Feedback threads pointed to UX improvements needed in search, AI assistant, and node builder flows.

  • Users often struggled with setup confusion and incomplete error reports, leading to feature requests for clearer guidance.

  • The forum also served as a place to ask “who is n8n for,” with both technical and non-technical users debating target personas.

What Surprised Me

  • The use cases range from simple hobbyist experiments to complex workflows that power businesses.

  • n8n has a community-driven economy, where tutorials and expertise can generate real income.

  • Many users are concerned about reliability and trust when transitioning from a demo to a production environment.

Rest Of The Product Memo

You can read the rest of the product memo here.

As I reflect on this section:

  • Looking at communities gives a raw view of user struggles and wins.

  • Not every post represents the majority, but repeated pain points indicate systemic problems.

  • For interview prep, analyzing forums and community chatter is a practical way to ground product insights.

