Archive for March, 2011

Building a Corporate University: Resources

By Bryant Nielson, Managing Director on March 22, 2011

Summary: Our next step in building a Corporate University is to determine resources. You must take into account the costs, staff needs, delivery systems, locations, and marketing before going to your executive team for funding.

You’ve taken the time to assess learning needs and possible delivery methods across the board, removing the “nice to know” and taking a hard look at the “need to know”. Now you must consider the resources you’ll need to make it all happen, not only from a cost perspective but also to paint a picture of how big the enterprise is going to be.

When it comes to staff, a Corporate University is a difficult proposition. It's hard to know how many instructors you'll need or how many people you'll need to manage those groups. But it's …



NSA NYC Speaker March 18, 2011

By Bryant Nielson, Managing Director on March 18, 2011

Bryant Nielson -- NSA Speaker March 2011

Technology for Speakers:
Potent Tools for Online and In-Person Presentations and Training

Speakers, trainers, coaches and consultants know that technology can be an asset to their businesses, but only if they know what's available (and that changes almost daily!) and how to quickly learn and customize these applications. In this practical and fast-paced program, training and technology expert Bryant Nielson will cover how to set yourself apart from the competition by customizing your programs using technologies such as:
  • Webinars as direct training and follow-up vehicles
  • Skype, Adobe Connect, WebEx, GoToMeeting, Oovoo and Fuse for client contact
  • Webcasting (Dyyno, Ustream, Mogulus) for live training online
  • Live engagement, live polling and more.
Special Guest: NSA's Technology Guru Terry Brock will join Bryant via Skype to share best practices for speakers using Skype!
About Bryant Nielson
Bryant Nielson is the Managing Director of CapitalWave Inc. and of FinancialTrainingSolutions (FTS), its live training division. Previously, he was the Strategic Alliance & Acquisitions Director for FTS. FTS is a global training firm that specializes in "Delivering Innovative Training Solutions" to the corporate and financial markets. With offices in New York and London, CapitalWave provides programs that cover the major financial markets. CapitalWave's approach to training is to avoid "off-the-shelf" programs, preferring to work with clients on customized programs that integrate the client's business objectives into the programs delivered.
http://www.nsa-nyc.org/article.html?aid=149

Building a Corporate University: Assessment

By Bryant Nielson, Managing Director on March 15, 2011

Summary: Building a Corporate University is a long-term project with many phases. In our next ten-part series, we will look at each phase of the University project, beginning with an across-the-board assessment of learning needs.

Before any solid plans are made for the Corporate University, you have to know the scope of the learning that will be required. Some organizations have multiple business units, each one carrying out a separate purpose but striving to move the organization forward. With this in mind, you must embark on a very thorough assessment of learning needs. Always be mindful of the organization's overall strategy and mission, which will tie into your assessment as well.



Simulations Evaluation

By Bryant Nielson, Managing Director on March 7, 2011

Now that you’ve decided to use simulations in your programs, let’s look at some best practices for evaluating simulation results.

A well-designed simulation will only be effective if you are able to evaluate the results – and pass those results on to the participants. As we've discussed, immediate feedback is a benefit of simulations, so the evaluation of final outcomes should come quickly as well, so that participants can promptly apply what they've learned. Let's discuss how to create simulation evaluations that are useful to both the organization and the participants.

The first step in creating effective simulation evaluations is to look closely at the delivery method. If the simulation is a complex, computer-based operation, the program itself should deliver an evaluation in an immediate context; a flight simulator, for example, will produce a crash if the pilot has made grave mistakes. Action-based simulations, like putting out a fire or assembling a piece of furniture, should be evaluated not only on the quality of the final outcome but also on the time it took to reach it. If the fire has been extinguished, how much time did it take and how much structural damage was done? If the chair has been assembled, how long did it take and will it collapse when someone sits in it? Case study simulations should be evaluated on their outcomes and, like all of the other simulations, on the consequences of wrong actions; we will examine this in a moment. Finally, if a group is involved, be sure to evaluate how well the group worked together as well as the contributions of individual members.

For any simulation, whether complex or not, take the time to list the desired outcomes. For example, a financial simulation could have outcome levels, such as cash savings of $100,000, $75,000, and so on. An HR-based simulation could have outcomes of successfully delivering permanent pay cut notices with a minimum of attrition. No matter what the topic of the simulation, the evaluation has to start with the desired outcomes.
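Listing desired outcomes as graded tiers can be sketched as a simple lookup. The tier names and dollar thresholds below are illustrative assumptions for the financial-simulation example, not figures from this article:

```python
# Hypothetical outcome tiers for a financial simulation: each entry pairs a
# minimum cash savings with the level a participant earns for reaching it.
# Tiers are listed highest-first so the best matching level is found first.
OUTCOME_TIERS = [
    (100_000, "exceptional"),
    (75_000, "strong"),
    (50_000, "acceptable"),
]

def outcome_level(cash_savings):
    """Return the highest tier whose savings threshold has been met."""
    for threshold, level in OUTCOME_TIERS:
        if cash_savings >= threshold:
            return level
    return "below target"
```

Defining the tiers before the simulation runs keeps the evaluation anchored to the desired outcomes rather than improvised after the fact.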

Reaching a successful outcome is one aspect of simulation, but participants should also know if they have taken the preferred steps for those outcomes. The preferred steps should coincide with applicable laws, natural phenomenon, organizational procedures, and even organizational culture. For instance, the HR simulation may end with a low attrition number but what happens if the participant tells simulated employees that their pay will rise back to its original point within a few months, when the cut was permanent?

Not only is it necessary to examine the preferred steps for evaluation, it is also necessary to look at the consequences for wrong actions. One way to design this part of the evaluation is through the use of a decision tree that maps out the right steps, the wrong steps, and the consequences. Consequences for wrong steps are a big part of simulation, because they help the participants learn and apply knowledge to the situation. With that in mind, remember to explain consequences in terms that are correlated with the simulation, such as lost dollars, lost time, or potential attrition rates. The ability to compare right steps with wrong steps using the same units is invaluable in application. Along these lines, though, be sure to have moderators point out correct thought processes even if the eventual step is incorrect. This may be especially true in group simulations, where some group members wanted to take the correct step or process.
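The decision-tree idea above can be sketched in a few lines: each node is a choice point, each branch records whether the step was correct and its consequence in the simulation's own units. The HR pay-cut branches and dollar figures below are hypothetical illustrations, not content from the article:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One point in the decision tree: a situation, whether reaching it was
    correct, and its consequence expressed in the simulation's units."""
    label: str
    correct: bool = True
    consequence: str = "$0"
    children: dict = field(default_factory=dict)  # step name -> Node

# Hypothetical HR pay-cut scenario mapped as a tiny tree.
tree = Node("Announce permanent pay cut")
tree.children["explain honestly"] = Node(
    "Low attrition, trust preserved", correct=True, consequence="$0")
tree.children["promise pay will return"] = Node(
    "Low attrition now, major trust loss later",
    correct=False, consequence="-$50,000 in future attrition")

def take_step(node, step):
    """Follow one step and report whether it was correct and at what cost."""
    child = node.children[step]
    return child.correct, child.consequence
```

Because right and wrong branches share the same units (dollars, time, attrition), moderators can compare them directly when debriefing participants.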

Finally, create a matrix or rubric that shows the criteria for the evaluation so that it is useful to both the moderator and the participants. For example, if a participant or group chooses a right step but makes errors along the way, their partial credit should reflect this and point out what was correct in their thought processes. In addition, weights in the evaluation should coincide with weights in the real world: a loss of dollars that causes an organizational bankruptcy should be weighted much more heavily than a loss that barely causes a shudder. Both are wrong, but, as in the real world, wrong choices carry degrees of consequence. Keep in mind that choices contrary to organizational culture or applicable law should be heavily weighted as well.
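A weighted rubric of this kind can be sketched as a small scoring table. The criterion names and weights below are illustrative assumptions; note that, following the article's advice, law and culture carry the heaviest weights, and partial credit is expressed as a fraction per criterion:

```python
# Hypothetical rubric: criterion -> weight. Weights sum to 1.0, with
# legal and cultural compliance weighted most heavily.
RUBRIC = {
    "followed applicable law":        0.35,
    "matched organizational culture": 0.25,
    "chose preferred steps":          0.25,
    "reached desired outcome":        0.15,
}

def score(marks):
    """Compute a weighted total from marks: criterion -> fraction earned
    (0.0 to 1.0, allowing partial credit for correct thought processes)."""
    return sum(RUBRIC[criterion] * marks.get(criterion, 0.0)
               for criterion in RUBRIC)
```

A group that followed the law and hit the outcome but missed the preferred steps would score well below one that did everything right, which mirrors the graded consequences the rubric is meant to capture.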

Here is one final tip on evaluation: if the simulation has multiple parts, be sure to create an evaluation for each part.