

Curriculum Development Process
I take a comprehensive approach to designing and developing learning experiences by using the ADDIE Model, Backwards Design, seeking feedback from subject matter experts, and implementing evaluation and iteration strategies. Check out my process below. Samples included.
Needs Analysis: Define the Problem and Identify the Solution
Problem: An advertising and marketing technology company needs to equip novice and seasoned users of its platform with technical training for new product features, in alignment with Product Management and go-to-market (GTM) processes. The company wants the training to market the feature in alignment with GTM, accurately portray its benefits to customers, and provide users with short, step-by-step instruction for navigating, using, and implementing the feature.
Solution: A short, scenario-based microlearning course that helps users understand the existing analytics gaps in the market, comprehend how the feature addresses those gaps, navigate the feature, and implement the feature in their own business cases in order to optimize performance and reduce spend. This course will contain a 4-minute video lesson, a knowledge check, and an NPS & CSAT evaluation. SMEs will provide feedback for the action map, learning objectives, storyboard, visual mockups, and the final product (in 2 rounds).
Tools: Vyond, Camtasia, Adobe Illustrator, Lucidchart, Microsoft Word, Microsoft Forms, Skilljar LMS
Performance Goal: Drive customer satisfaction for the List Page Analytics feature and training module. Drive product adoption of the new software feature.
Audience: Novice to intermediate users of the software, with a moderate level of industry knowledge. More specifically, self-service campaign managers and hands-on-keyboard internal employees who oversee managed services.
Needs Analysis: Action Mapping
Action Mapping: Met with 2 primary stakeholders, who are also subject matter experts (SMEs)—the Product Manager and the Technical Writer—to determine what the training goals should be. I guided the SMEs to identify the observable actions that a learner will take upon completion of the training. The SMEs identified which actions are essential to measuring user success.
Afterward, I distilled their responses into an action map in order to derive the primary learning objectives for the training.
Feedback: SMEs provided feedback for the action map and LOs, focused on overall business goal alignment, clarity, and intended outcomes of the training.
Performance Goal: Drive customer satisfaction for the List Page Analytics feature and training module. Drive product adoption of the new software feature.
Learning Objectives:
- Comprehend the benefits of using List Page Analytics to improve current and future campaigns
- Navigate the List Page Analytics time series graph
- Identify 3 use cases for List Page Analytics
- Apply information gathered from using List Page Analytics to inform and optimize current campaigns
Design: Storyboard (Text)
Storyboard (Text): Once the SMEs have approved the action map, performance goal, and learning objectives, I begin building a blueprint for the training.
The storyboard outlines each sentence on the audio track, supporting visual assets, and graphic design elements in alignment with the objectives. This training will teach the current gaps in analytics for advertisers, the feature's benefits and solutions, and navigation of the feature through a scenario-based, step-by-step demonstration.
Feedback: SMEs provided feedback on the storyboard, focused on the accuracy and flow of the training content (audio, visual, and interactive elements), alignment with marketing voice, and appropriate levels of depth in the content—particularly for nuances in the software.
Design: Visual Mockups
Color Schemes & Fonts: Once the SMEs have approved the storyboard, I begin developing visual assets by deciding on the color scheme and fonts. Because this company had recently been acquired, it opted for a neutral color and font scheme that was not brand-specific.
I used Microsoft PowerPoint, Vyond, and Camtasia to create prototypes of the video lesson, along with a text-based sample of the knowledge check.
Development: eLearning Tools
Tools: Once the visual mockups have been approved by SMEs, I decide on the eLearning authoring tools that will bring the visual assets to life. For this project, I used Canva, Vyond, Microsoft PowerPoint, and Camtasia, along with animation assets from a repository in Vyond. Once a draft of the project is complete, I request feedback on the overall training from SMEs, providing guidelines on what to evaluate: engagement, flow, and experience.
Features: Custom animations, visual cues, closed captions, scenario-based step-by-step demonstrations in the platform, and text-based annotations. Once the learner completes the course, they're guided to a 3-question knowledge check.
Feedback: SMEs provided feedback on the first draft of the course, focused on the accuracy of the end user's flow, the demonstration of complex or nuanced concepts and steps, and the presentation of the feature to the market. SMEs also reviewed the knowledge check. Once I made revisions to the course, SMEs reviewed it again to ensure the training module was ready to be delivered.
Implementation: Skilljar LMS
SCORM & Learning Management System: Once I gather SME feedback and make final revisions to the course, I export it as a SCORM file and upload it into a SCORM-compliant Learning Management System. This project was uploaded into Skilljar as a SCORM 1.2 file. I collaborate with the company's corporate marketing stakeholders to market the new course in alignment with marketing methodologies and GTM strategy.
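As a sketch of what the export step produces: a SCORM 1.2 package is a zip archive whose root contains an imsmanifest.xml describing the course structure. A minimal manifest for a single-SCO course like this one might look like the following — the identifiers, titles, and file names here are illustrative placeholders, not the actual project's:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<manifest identifier="course.list-page-analytics" version="1.2"
    xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2"
    xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <metadata>
    <schema>ADL SCORM</schema>
    <schemaversion>1.2</schemaversion>
  </metadata>
  <organizations default="org1">
    <organization identifier="org1">
      <title>List Page Analytics</title>
      <!-- One item per SCO; this course is a single video lesson -->
      <item identifier="item1" identifierref="res1">
        <title>List Page Analytics Video Lesson</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <!-- adlcp:scormtype="sco" marks the launchable, score-reporting unit -->
    <resource identifier="res1" type="webcontent" adlcp:scormtype="sco" href="index.html">
      <file href="index.html"/>
    </resource>
  </resources>
</manifest>
```

Authoring tools generate this file automatically on export; the LMS reads it to find the launch file and track completion.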
Once the learner completes the course, they're guided to a 3-question knowledge check. The knowledge check was built in Skilljar using the quiz feature. Questions include both positive and negative feedback depending on the learner's response, along with cues to try answering failed questions again. The questions scaffold in order of the learning objectives.
Evaluation & Iteration: NPS & CSAT
Course Evaluation: The success of the course is evaluated in 2 ways: the 3-question knowledge check and the course feedback survey, which measures both NPS and CSAT scores. This survey also includes an open-ended response for any user feedback. Recall the performance goal and learning objectives for this training:
Performance Goal: Drive customer satisfaction for the List Page Analytics feature and training module. Drive product adoption of the new software feature.
Learning Objectives:
- Comprehend the benefits of using List Page Analytics to improve current and future campaigns
- Navigate the List Page Analytics time series graph
- Identify 3 use cases for List Page Analytics
- Apply information gathered from using List Page Analytics to inform and optimize current campaigns
Knowledge Check: The knowledge check tells me how well learners retained the material and whether they are taking the actions we expect after completing the course. The results of these quizzes are available in Skilljar, where I can view both individual and aggregate quiz scores.
Course Feedback Surveys: Course feedback surveys inform me of customer satisfaction with the learning experience and predict business growth.
I review aggregate quiz scores in Skilljar and the average NPS and CSAT scores weekly for every course. By examining overall scores and trends over time, I can determine if, how, and when a learning experience should be iterated on. Since the performance goal for this learning experience is customer satisfaction and product adoption, these metrics tell me whether the training met the goal.
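To make the two survey metrics concrete, here is a minimal sketch of how they are conventionally computed. The thresholds below are the standard definitions (NPS promoters score 9–10, detractors 0–6; CSAT counts 4–5 on a 5-point scale), but the function names and sample scores are hypothetical, not drawn from the actual survey data:

```python
# Sketch: computing NPS and CSAT from lists of survey ratings.
# The sample data below is illustrative, not real course feedback.

def nps(scores):
    """Net Promoter Score from 0-10 ratings:
    % promoters (9-10) minus % detractors (0-6), range -100 to 100."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

def csat(scores):
    """CSAT from 1-5 ratings: percentage of respondents answering 4 or 5."""
    satisfied = sum(1 for s in scores if s >= 4)
    return round(100 * satisfied / len(scores))

nps_scores = [10, 9, 8, 7, 6, 10, 9]   # hypothetical weekly NPS responses
csat_scores = [5, 4, 3, 5, 4]          # hypothetical weekly CSAT responses

print(nps(nps_scores))   # 4 promoters, 1 detractor, n=7 -> 43
print(csat(csat_scores)) # 4 satisfied, n=5 -> 80
```

Tracking these two numbers week over week is what surfaces the trend lines the iteration decisions are based on.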