Learning Styles and DISC Profile

LEARNING STYLE

We took tests in IDD620 to determine learning styles. I tested very high for visual and auditory but very low for kinesthetic.

Visual and Auditory Learning: I’ve always known that I am a visual learner. Because I prefer visual to auditory, I wouldn’t have expected to score so high in auditory. I see now that I do my best learning when I use both together, but I never deliberately sought to use both because they are strengths. I tend to enjoy lectures if I am able to take notes. I have always liked outlining and drawing graphic organizers (though I never knew they were called that until this class). My favorite way of studying is by myself, reorganizing content into my own study guides. My second-best way of learning is teaching someone else, as in a group setting.

Kinesthetic Learning: I guess this means I shouldn’t try to walk and chew gum at the same time. In elementary school, I hated doing crafts and making dioramas. In high school, I hated chemistry lab and home economics class, where I had to sew (yes, I am THAT old). Now I know why. I didn’t mind biology lab, probably because of the visual aspect.

DISC PROFILE

I think my DISC results really capture the way I see myself at work. I was surprised by how much the results resonated with me. I started with the free test, which described me this way:

You have an inner motivation to gain knowledge and become ‘the expert’. You have the self-discipline to focus and you aim for high standards. You appear to be relaxed and are likely to have plenty to talk about. People see you as knowledgeable, non-threatening and easy to get along with.

Then I was interested enough to buy access to the full test results. They showed that, for Dominance, I score high in composed/reserved and low in direct/competitive. I agree with this because I like to compete, but mostly with myself rather than others. For Influence, I scored low in factual/analytical and high in social/outgoing, which surprised me because I usually test as an introvert. I think this means that I enjoy being outgoing when I am interested in something, but not just for the sake of being social. For Steadiness, I scored high in impulsive/changeful and low in consistent/thoughtful, which is a little surprising because in Compliance I scored low in independent/uninhibited and high in conventional/reliable. I do see both characteristics (impulsiveness and reliability) in the way I behave; they seem to be opposites, but somehow both feel right to me.

For my DISC profile, I was categorized as The Evaluator, with this description:

Your prime value to an organization is: Your ability to work with the team and make things happen. Nine times out of ten your plan will work.

I’m not sure if this is true, but I know that I like to formulate plans that other people accept and that succeed. The success of a plan gives me a better feeling than being recognized as the person who created the plan.

I like to think that my personality and preferences will make me a good team member who is flexible and more interested in doing great work than in being “right” all the time. That doesn’t mean I like to be wrong! But I want to change if I am wrong, because I am always interested in getting better, being more effective. I love the marriage of opposites in IDD: creativity and structure. I think this is another reason IDD appeals to me—because I have similar preferences in impulsivity and reliability.

Work Process: Designing a Game

THE GAME

I designed a board game called Super (Quiz) Bowl. It combines trivia questions with the rules of football: the game board is a football field with an “end zone” at either end, which must be crossed to score a touchdown. I made my game board out of foam core board, but you could use a sheet of paper marked at 10-yard intervals.

This game requires the football field (described above), four dice, two timers (showing seconds—we used the stopwatch function on our phones), Play cards (I made these on Canva and printed them out at home), Question cards (I used brain-teasers and trivia from the internet), and a football game piece (you could use a paperclip here or some other everyday object).

Ideally, 3-9 people may play. This game requires two teams, of 1-4 players each, which alternate between being “Offense” and “Defense,” and an “Official.” If there is an even number of players, one person on each team must alternate acting as the Official.

The object of the game is to advance down the field toward the other team’s end zone (where a roll of dice determines the degree of advancement for each turn or “down”) and to score the most points (by touchdowns or kicks), while answering questions and overcoming the attacks of the opposite team.

The Offense (1) rolls dice to determine the possible yardage, (2) draws a “Play card,” which determines the response of the Defense during the play, and (3) answers a question, either (a) a brain teaser or (b) trivia.

The Defense has chances to compete during the 30 seconds allotted for the Offense to answer the question. The Defense may earn a “turnover” by besting the Offense during the question time, which gives them possession of the ball so that they begin competing as the Offense. The Offense can score by touchdown or kick.

The Official keeps track of downs, reads questions, judges answers, and is both time and score-keeper.

The game is over when one team reaches 35 points or 20 minutes have expired (there is a half-time break after 10 minutes of play).

THE PROCESS

Here are the iterations of my game:

Initial Concept (Brainstorming):

My main question: What are some activities where people lose track of time while having fun?

Answer: Football!

Second question: How can I combine traditional football rules with board game play?

Answer: Use quiz-bowl-type questions to determine the progress of the Offense.

Third question: Where would I find questions that anyone could answer, but that would take some time and deliberation, to add drama during the play?

Answer: This is a weakness. I had trouble finding questions that I could use for my beta testing. I didn’t have time to write an entire question bank. I found some websites that offered various trivia and quiz-bowl-type questions, but none were completely satisfactory.

Then I sketched out the elements of the game:

Roll 4 dice to determine possible yardage (rolling a pair doubles the face value of the roll, rolling a triple triples the yardage, and rolling a quad allows you to go for a touchdown right away)

Draw a “Play” card to determine the play (really, this is the response of defense during play)

Run (Defense gets to distract offense during answer time)

Pass (toss-up question, ring a bell to “win” the chance to answer)

Punt (Defense gets to answer first)

Kick (Defense gets no chance, Offense only answers)

Answer questions – timed for 10 seconds

I wrote the directions in a Word document and drew up a game board and cards.

The Amys’ Review

I shared the game with my friend Amy and a friend of hers (also named Amy!). They loved it, but had some confusion about:

The yardage determination with the dice

The “Run” card (this allows the defense to distract the offense during play to prevent them from answering the questions)

If each team works together, or if the individuals answer independently

The question bank was still an issue

Linda’s Review

I revised and clarified my directions and shared them with my friend Linda. We played a few rounds and she gave me some suggestions:

Rolling the dice: doubling and tripling the yardage makes the numbers too high. Instead:

For a double, add 5 yards.

For a triple, add 10 yards.

Explain the purpose of the Play cards sooner in the directions (they were referred to early in the directions but not explained in detail until the end).

When drawing a “Pass” Play card, instead of ringing a bell to win the chance to answer, place a household object in the middle of the game board; the team that is ready first picks it up. This made the game easier to replicate at home.

Increase the time allotted to answer each question from 10 seconds to 30 seconds.

The question bank was still a problem, but the game worked well. It was fun (when finding a question wasn’t a problem).
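The revised dice rule from Linda’s review is easy to sandbox in code. The sketch below assumes the base yardage is the sum of the four dice (the directions above don’t spell this out) and keeps the original quad rule, since the revision didn’t mention it:

```python
import random
from collections import Counter

def roll_yardage(dice=None):
    """Yardage for one down under the revised rules (a sketch).

    Assumes base yardage is the sum of the four dice; a double adds
    5 yards and a triple adds 10. A quad returns None to signal the
    original "go for a touchdown right away" rule.
    """
    if dice is None:
        dice = [random.randint(1, 6) for _ in range(4)]
    counts = Counter(dice).values()
    if 4 in counts:
        return None          # quad: touchdown attempt
    yards = sum(dice)
    if 3 in counts:
        yards += 10          # a triple adds 10 yards
    elif 2 in counts:
        yards += 5           # a pair adds 5 yards (two pairs still add 5)
    return yards
```

For example, under these assumptions a roll of 2, 2, 3, 4 yields 16 yards (11 from the faces plus 5 for the pair).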

Family

I made the corrections from my time with Linda and sat down to play with my family. There were two kids and two adults.

The changes made with Linda’s help worked well. Everyone thought the game was fun and had a lot of potential.

Once again, the biggest trouble was the question bank. Some questions were too easy and went too quickly. Some questions were too hard.

Overall Effectiveness/Usefulness

Effectiveness of the game:

Games are supposed to be fun. This was fun!

Games are unique in the way you tend to lose track of time when you are playing, and this was true for this game.

Drawbacks:

The inconsistent difficulty of the questions hurt the game because it undermined the sense of fairness. With the right question bank, I think this game would have excellent potential in the commercial or educational markets.

You have probably already figured this out, but this game is complicated, more like a board game you would buy. It takes one play-through to figure it out, but then it is really fun. I think it would be hard to replicate with homemade objects because of the need for the Play cards (see the picture below).

Value for learning/Usefulness in educational or training setting

This game is not dependent on a particular genre of trivia, so any category of questions could be inserted into the structure of the game, and it would still work (provided the questions were written for the skill and age of the players).

Therefore, this would be an excellent game for reviewing concepts with small groups in classrooms for kids ages 10-18. It could also be useful in higher education and training for reviewing concepts or skills, though it might be more popular with adults in the board game market.

Ultimately, the greatest challenge was writing appropriate questions. This was the biggest problem with the game, but building a bank of questions is a project unto itself, so I concentrated on the mechanics of the game, which worked very well (once we got the kinks out).

What’s an Educational Wiki?

Educational wikis are used in both traditional and online classes from elementary school through graduate school and into the workplace. Wikis are used for group or collaborative authoring, building courseware, developing and documenting work on papers or research projects for peer review, tracking and streamlining group projects, reviewing classes and teachers, and building critical skills for similar application in the workplace (Robinson, 2006, p. 108; Duffy & Bruns, 2006).

The individual and collaborative work developed using wikis is transforming how people learn in many ways. For example, wikis are contributing to the shift from instructor-centered teaching to student-centered learning (Bold, 2006, p. 12). In this way, the use of the technological platform reflects the theories of Constructivism (knowledge is developed internally, by learners, as they encounter and solve real-world problems) and Social Constructivism, where collaboration is integral to the group construction of knowledge (Reiser & Dempsey, 2018, pp. 72-73). Brown explains, “knowledge has two dimensions, the explicit and tacit. The explicit dimension deals with concepts…[whereas] tacit knowledge is best displayed in terms of performance and skills” (Brown, 2010, p. 15). Both explicit and tacit knowledge increase when learners collaborate in a constructivist “community of practice,” dealing with real problems (Brown, 2010, p. 15). These real-world applications serve to connect knowledge to situations where the purpose is clear; this is both Situated Cognition and Anchored Instruction (Reiser & Dempsey, 2018, p. 70).

Educational use of online sharing platforms is actually changing both learning and technology. In the past, technology was used to facilitate learning. Today, new technologies, like wikis, continue to facilitate learning activities, but now the learning activities are also changing the face of technology (Reiser & Dempsey, 2018, p. 69). This dynamic relationship between technology and learning points to Pea’s concept of distributed intelligence. He explains that learners develop intelligence when “interacting with [cognitive tools] distributed across minds, persons, and the symbolic and physical environments, both natural and artificial” (1993, pp. 47-48). Pea defines cognitive tools as any practice or medium (including the use of computer, online, and social technologies) “that helps transcend the limitations of the mind, such as memory, in activities of thinking, learning, and problem solving” (Gebre, Saroyan, & Bracewell, 2014, p. 9). Cognitive tools like wikis create the opportunity to transcend traditional educational limitations by “allowing learners to externalize their internal representations” and to participate in the construction of both technology and learning (Gebre, Saroyan, & Bracewell, 2014, p. 10).

Classmates, have you realized you are participating in the growth and change of both knowledge and technology? I hadn’t really considered this until now. What do you think about the idea of distributed intelligence—that anything in your environment can be a cognitive tool to grow intelligence?

 

References

Bold, M. (2006). Use of wikis in graduate course work. Journal of Interactive Learning Research, 17(1), 5-14. Retrieved from https://www.learntechlib.org/d/6033/

Brown, J. S. (2010). Growing up digital: How the web changes work, education, and the ways people learn. Change: The Magazine of Higher Learning, 32(2), 11-20. Retrieved from http://www.johnseelybrown.com/Growing_up_digital.pdf

Duffy, P., & Bruns, A. (2006). The use of blogs, wikis and RSS in education: A conversation of possibilities. In Proceedings Online Learning and Teaching Conference 2006 (pp. 31-38). Brisbane. Retrieved from https://eprints.qut.edu.au/5398/1/5398.pdf

Gebre, E., Saroyan, A., & Bracewell, R. (2014). Students’ engagement in technology rich classrooms and its relationship to professors’ conceptions of effective teaching. British Journal of Educational Technology, 45, 83-96. doi:10.1111/bjet.12001

Pea, R. D. (1993). Practices of distributed intelligence and designs for education. In G. Salomon (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 47-87). New York: Cambridge University Press. Retrieved from https://telearn.archives-ouvertes.fr/file/index/docid/190571/filename/A67_Pea_93_DI_CUP.pdf

Reiser, R. A., & Dempsey, J. V. (2018). Trends and issues in instructional design and technology. Boston: Pearson Education.

Robinson, M. (2006). Wikis in education: Social construction as learning. Community College Enterprise, 12(2), 107-109. Retrieved from https://www.questia.com/read/1P3-1167542181/wikis-in-education-social-construction-as-learning

A Comprehensive Evaluation Model

Evaluation is an essential element in the process of designing and implementing educational and training programs. The analysis of a program’s successes or failures allows for improvement in the learning and performance of individuals as well as greater efficiency for the organization (Reiser & Dempsey, 2018, pp. 87, 91). Models should provide for both formative and summative evaluation (p. 87). With this in mind, what follows is a proposal for a new order of evaluation, which borrows from Stufflebeam’s influential CIPP model, Rossi’s question-based Five-Domain Evaluation Model, and Kirkpatrick’s Training Evaluation, Levels 3 and 4 (pp. 88, 92).

In this Comprehensive Evaluation Model, there are three stages: Foundational Considerations, Procedural Concerns, and Outcomes Valuation. The three stages provide opportunities for both formative assessment, which evaluates the process of the program and implements improvements as needed, and summative assessment, which is concerned with evaluation in any area other than development (Reiser & Dempsey, 2018, p. 87).

The first stage of the Comprehensive Evaluation Model is Foundational Considerations, which begins with context evaluation, derived from CIPP and Rossi’s Five Domains; this is a needs assessment to determine if the program is necessary (Reiser & Dempsey, 2018, p. 88). The second step, like CIPP’s Input Evaluation, is concerned with whether the available resources and support are adequate to implement the program (p. 88). The third and final step considers the concept of the program, like Rossi’s Theory Assessment, analyzing the potential success of the overall program concept in an effort to avoid theory failure (p. 88).

The Procedural Concerns portion of this new model deals primarily with formative evaluation where the development of the program and the process of implementation are deliberated, looking for measures that might improve effectiveness (Reiser & Dempsey, 2018, p. 88).

Finally, the Outcomes Valuation is a summative evaluation focusing on the overall success of the product (i.e., the program). Considerations here include an implementation assessment, asking Rossi’s question, “Was the program implemented properly, according to the program plan?” (Reiser & Dempsey, 2018, p. 88). Next comes an impact study, evaluating whether or not learner behavior changed as the program intended. This evaluation uses Kirkpatrick’s Training Evaluation Model, Levels 3 and 4. At this point, evaluators must determine (1) if learners can apply learned concepts in the necessary arena (workplace or classroom) and (2) if the program implemented change that improved the performance of the organization (p. 92). All of these factors must be weighed against an efficiency assessment, like Rossi’s, to determine the return on investment or the cost effectiveness of the program (pp. 88-89).
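As a sketch, the three stages and their guiding questions, drawn from the descriptions above, can be captured as a simple checklist structure. The data structure and helper function here are my own illustrative choices, not part of any published model:

```python
# A hypothetical checklist representation of the Comprehensive
# Evaluation Model described above. Stage and step names come from
# the text; the structure itself is only an illustration.
COMPREHENSIVE_EVALUATION_MODEL = {
    "Foundational Considerations": [   # up-front checks before development
        "Context evaluation: is the program necessary? (needs assessment)",
        "Input evaluation: are resources and support adequate?",
        "Theory assessment: is the program concept likely to succeed?",
    ],
    "Procedural Concerns": [           # formative evaluation during development
        "Monitor development and implementation; improve as needed",
    ],
    "Outcomes Valuation": [            # summative evaluation of the product
        "Implementation assessment: was the program implemented per plan?",
        "Impact study: did learner behavior change? (Kirkpatrick Levels 3-4)",
        "Efficiency assessment: return on investment / cost effectiveness",
    ],
}

def evaluation_steps(model):
    """Flatten the model into one ordered list of evaluation steps."""
    return [step for stage in model.values() for step in stage]
```

Walking `evaluation_steps` in order would take an evaluator from needs assessment through cost effectiveness, mirroring the sequence of the three stages.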

Reiser, R. A., & Dempsey, J. V. (2018). Trends and issues in instructional design and technology. Boston: Pearson Education.