Pilot Evaluation Report: What we learned from testing ENTER-CBL in real classrooms
Designing an innovative entrepreneurship course is one thing. Delivering it with real students, in different countries, under real academic constraints is where the learning happens for everyone: students, teachers, and the project team.
That is why ENTER-CBL ran a full pilot implementation of the NextGen Entrepreneurs learning path and then evaluated it carefully. The Pilot Evaluation Report brings together participant feedback and platform usage evidence to answer a simple question:
What worked well, what needs improvement, and how can we scale Challenge-Based Learning in entrepreneurship education in a practical, sustainable way?
What was piloted
The pilot tested how the CBL approach performs in entrepreneurship education when supported by an LMS-based learning environment (the ENTER-CBL Hub). Students worked in teams, developed solutions to real challenges, produced learning artefacts, and progressed through a structured learning pathway supported by resources, tasks, and clear expectations.
How the evaluation was conducted
The evaluation combined two short participant surveys with platform-based evidence: platform analytics captured engagement and completion patterns across institutions throughout the pilot.
What participants appreciated most
The evaluation confirms a strong message: the pilot was experienced as a valuable and meaningful learning journey, especially because it moved beyond theory into action. Participants most often valued the practical, tool-based nature of the learning path and its clear connection to real entrepreneurship work. Three strengths stood out:
- Practical frameworks and tools that helped structure entrepreneurial thinking (e.g., planning and modelling tools, structured reflection and decision-making techniques).
- Teamwork-based learning that strengthened collaboration, communication, and leadership skills.
- A sustainability perspective that encouraged students to connect entrepreneurship with wider social and environmental priorities.
What the platform data showed
The platform evidence complements participant feedback by showing how the pilot worked operationally. A total of 237 users completed activities on the platform. Engagement levels varied between institutions, which is expected in an international pilot, but the overall average activity completion rate reached 66%.
This matters because CBL generates many learning outputs: challenge definitions, research evidence, iterations, prototypes, and reflections. An LMS can make this process easier to manage by keeping learning organised, visible, and measurable - without reducing CBL to “just content consumption.”
What should be improved before wider scaling
The report also provides clear recommendations for strengthening the learning experience. The most consistently cited areas for improvement were:
- More real-world examples: stronger use of case studies and practical examples to help learners connect concepts to concrete entrepreneurship contexts.
- Clearer structure and workload balance: condensing some parts, improving clarity of instructions, and reducing repetition to keep the flow more focused.
- More interaction points: additional group activities, peer feedback, and guided discussion to increase collaboration and motivation.
- Sharper guidance on sustainability concepts: supporting learners in applying sustainability and related concepts more consistently in business model decisions.
What happens next
The Pilot Evaluation Report is not a “closing document.” It is a practical improvement tool. The findings are being used to refine the course flow, strengthen facilitation guidance, and improve the platform-supported learning experience so that future deliveries are even more consistent, manageable, and impactful.