tiny.cc/joinmooclets allows you to join the Modularity & MOOClets Working Group's mailing list.
You can participate remotely since presenters stream their screen and any discussion over Google Hangouts on Air.
For a new paper on a novel technical definition of a MOOClet, see:
Williams, J. J., Li, N., Kim, J., Whitehill, J., Maldonado, S., Pechenizkiy, M., Chu, L., & Heffernan, N. (Working Paper, under review). MOOClets: A Framework for Improving Online Education through Experimental Comparison and Personalization of Modules. [PDF from SSRN] [Google Doc]
Online courses naturally extend the historical focus on in-person courses. But the future uses of blended & online education may benefit from targeting development at the level of the modular components ("MOOClets" such as videos, lessons, exercises, interactive tools) that make up courses.
Why?
Closer match to learner needs. People can use modular resources to learn a particular concept or skill when a specific need arises or a question has to be answered. (E.g. How to do a t-test.) Online resources reduce reliance on learning from semester-length courses taught by a single instructor to one class, as reflected in work on Just-in-Time, On-Demand, or Subscription Learning.
Leverage Insights & Support Blended Learning in existing educational contexts. The modular nature of MOOClets makes it easier to create them by building on materials from existing educational environments (lesson plans, lectures, exercises, tutoring). These materials can be used in MOOCs and are then more appropriate as resources to support blended education (flipped classrooms, homework exercises). This facilitates links between recent technology/content development and existing practical needs in K-12, higher education, and the workforce.
Identify Generalizable Design & Pedagogical Principles at the grain size of learning from videos, exercises, conversations, which form the components of most existing (and likely future) online courses, and often also map onto modular units in software/technology platform development.
Iterative Improvement as part of a Self-Improving System. Modules can be more readily re-used and iteratively improved through practical experience, experimental and other kinds of research. Just as Wikipedia's strength was not its first 1000 articles but a system for successive improvement, a development process organized around developing educational modules can facilitate revision & improvement more than a system organized around courses.
Collaborative Development by Researchers, Practitioners, and Entrepreneurs, which is facilitated by reducing the demands of the development process (e.g. hundreds of hours for a MOOC vs. time to develop an example of a lesson or exercise). Bite sized online resources provide a concrete focal point for improvement by "expert crowdsourcing" of diverse practical, scientific, and business perspectives, and can promote collaboration if the conflicts between the goals and constraints of different groups (e.g. Researchers & Entrepreneurs) can be identified rapidly or resolved by creating multiple variations of a MOOClet.
The MOOClets Group is an interdisciplinary set of researchers, practitioners, and entrepreneurs who meet to consider the principles, processes, and technology that underlie the creation of modular online educational resources (or "MOOClets") that are designed to support scientific research, create evidence-based practical resources, and be financially sustainable.
Using rapid-authoring technology to create modules that can be more easily created, revised, and collaboratively edited allows us to jointly meet the goals of:
> Researchers (MOOClets can be easily edited by researchers & used to run experiments & collect data).
> Practitioners (easy-to-use & evidence-based resources for K-12, higher education, and workplace learning).
> Entrepreneurs (identifying those high quality resources that measurably improve learning in a financially valuable way).
A meeting last quarter covered Developing Adaptive MOOClets: allowing tailored feedback, different versions, and hyperlinks to extra resources.
This meeting showed how Qualtrics & other Rapid-Authoring Software for E-Learning could be used to quickly create MOOClets/Online Modules (without programming) that were adaptive – the lessons and quizzes presented after a learner completed one question were changed based on their responses.
Principles for producing good adaptive learning were discussed and documented.
We identified ways of adding "adaptability" to current MOOCs or Ed-Tech products that allow learners more freedom in adapting their own path through a lesson (e.g. On-Demand information requests using hyperlinks to drop-down text).
Visitors (from the Open Learning Initiative at Stanford) presented on Using MOOClets to communicate principles for instructional/course design.
An online video (on Khan Academy or a MOOC platform; tiny.cc/augmentedvideo is an example of how such a video can be easily changed to enhance learning)
A unit of a course, like a week-length statistics unit explaining how to do a t-test, consisting of videos and exercises (e.g. in a MOOC platform like EdX/Coursera/Udacity, OLI, WISE)
The Codewebs engine developed by Jonathan Huang and Chris Piech, which is a web app that gives real time feedback for student code submitted to a MOOC. The feedback from Codewebs is based completely on the massive collection of submissions by other students. See the Paper.
An online exercise: E.g. Math exercises like those on Khan Academy (tiny.cc/practiceasusual, tiny.cc/whatwhyhow)
Video or text resources intended to motivate students and teach them study skills (e.g. tiny.cc/learningassistant)
An interactive learning coach to help students work through problems (e.g. tiny.cc/kalearningcoachb)
Simulations & Visualizations (e.g. http://tiny.cc/tvlmdx, http://tiny.cc/twlmdx, PHET, WISE)
The MOOClets Working Group is now located at Harvard (HarvardX) and will start meetings for the fall in October. Joining the mailing list (tiny.cc/joinmooclets) will give you notice when it begins.
Remote participation: The Google Hangout link will be posted weekly at tiny.cc/lyticshangout. For those in the MOOClet Group (tiny.cc/joinmooclets), the screencast will be available afterwards at tiny.cc/mooclet.
Tentative/Open to Revision.
Why focus on "MOOClets" rather than MOOCs? The advantages of focusing on modular resources or "MOOClets" rather than MOOCs.
Introductions
Review of MOOClets insights from last quarter
Discussion of Topics & Agenda for this quarter.
Why focus on "MOOClets" rather than MOOCs?
The advantages of focusing on modular resources or "MOOClets" rather than MOOCs:
A key advantage of focusing on a MOOClet/modular resource rather than a full MOOC is that it allows researchers and many others to be "course developers". Even if they do not have time to spend hundreds of hours creating an eight-week MOOC, they can develop a single lesson or interactive problem.
This meeting will consider how Qualtrics (a survey software tool freely available to anyone at Stanford) can be used to author online lessons & exercises, and how these can be embedded within EdX MOOCs.
Embedding this additional software tool adds its capacities to those of the MOOC platform, such as:
Rich pedagogical capacities – targeted feedback, adaptive learning, validated responses, control over timing dynamics, sophisticated setting of variables and if-then branching logic based on previous responses and behaviors.
Any data collected about learners is instantly accessible in a research-friendly format.
A/B testing or randomized assignment is possible & more flexible.
Rapid development & use of logic without programming.
Collaborative editing & markup on learning resources.
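As a sketch of what this no-programming branching logic amounts to, here is a minimal, hypothetical example in Python (the question names, thresholds, and module names are all invented for illustration, not Qualtrics internals):

```python
# Minimal sketch of the if-then branching logic rapid-authoring tools
# expose. All question names, thresholds, and module names are hypothetical.

def next_mooclet(responses):
    """Choose the next module based on a learner's previous answers."""
    # If the learner missed the warm-up question, show a remedial lesson.
    if not responses.get("warmup_correct", False):
        return "remedial_lesson"
    # If they self-report low confidence, show a worked example first.
    if responses.get("confidence", 5) <= 2:
        return "worked_example"
    # Otherwise proceed to the standard practice exercise.
    return "practice_exercise"

print(next_mooclet({"warmup_correct": True, "confidence": 1}))  # worked_example
```

The same conditional structure underlies showing different lessons and quizzes after a learner completes a question, as in the adaptive MOOClets demonstrated last quarter.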
At the moment we use Qualtrics & OpenEdX because Qualtrics is extremely well suited to online learning, but this approach extends to other MOOC platforms. Qualtrics has been linked to OpenEdX and to Coursera, and using LTI with NovoEd provides some functionality.
There are many other Rapid Authoring Tools like Qualtrics, for which a bit of groundwork simply needs to be done to link them to MOOCs.
Demonstration of how to embed Qualtrics into EdX:
Participants in the MOOC were shown different versions of a Motivational Welcome Message:
Here is how the message was shown in a MOOC on class.stanford.edu via a Qualtrics survey embedded as iFrame.
You will only see *one* version of the message (other participants will see different versions, up to six different alternatives).
Different versions of Prompts to Answer Reflection Questions:
Qualtrics survey with questions embedded as iFrame below target video
This is a direct link to the Qualtrics survey that is being embedded into the class above. You can go through that survey several times to see all three different versions – no questions at all, questions that have no text entry fields, questions that have text entry fields.
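The random assignment behind showing each participant one of several versions can be sketched as follows (a minimal illustration with invented version labels; hashing the learner ID keeps the assignment stable across revisits, one common design choice for this kind of demo):

```python
# Sketch of randomized assignment to one of several message versions.
# Hashing the learner ID makes the assignment deterministic per learner,
# so a participant sees the same version on every visit.
import hashlib

VERSIONS = ["v1", "v2", "v3", "v4", "v5", "v6"]  # hypothetical labels

def assign_version(learner_id: str) -> str:
    digest = hashlib.sha256(learner_id.encode()).hexdigest()
    return VERSIONS[int(digest, 16) % len(VERSIONS)]

# The same learner always gets the same version:
assert assign_version("learner42") == assign_version("learner42")
```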
This week has two related goals:
One is to briefly review and continue last week's progress on Rapid Authoring of MOOClets: Embedding Qualtrics into EdX. That session showed how authoring of pedagogically rich, interactive, and adaptive exercises and content can be achieved by combining the technological capacities of OpenEdX with those of Qualtrics (embedded via iFrames), creating interactive MOOClets that extend EdX's pedagogical and data analytics functionality.
The second is to consider what set of MOOClets/Modules would be appropriate and compelling to actually put up on a platform like EdX (or Coursera/NovoEd). For example, at class.stanford.edu there are a number of full MOOCs/Courses listed.
One idea is to list a MOOC on class.stanford.edu/OpenEdX (or www.coursera.org or www.novoed.com) that is a novel type of "course": a collection of MOOClets on (for example) the introductory topic of statistical variability and how to conduct a statistical test, along with a few modules on learning and problem-solving strategies for learning statistics (and other topics) in MOOCs.
To be concrete (but very tentative) about what this might look like:
Potential Batch of MOOClets to be provided via a MOOC titled "MOOClets for Introductory Statistics: Understanding variability and the logic of statistical hypothesis tests, and their relevance to real-world problems"
The first key concept here could be on understanding variability: intuitive examples of how understanding it is important for everyday problems, how it is measured and calculated (formally, standard deviation and variance), and what one can do with an understanding of it (which leads to a second concept).
The second key concept could be on the logic of conducting a statistical test: intuitive examples of why one needs to know whether differences between groups (e.g. the proportion of students passing an exam after being assigned vs. not assigned a supplementary online homework program) are real or due to chance, how a statistical test (e.g. t-test or chi-square) is calculated using measures of variability/standard deviation, and what one can do with an understanding of what a statistical test is.
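To make the two key concepts concrete, here is a small worked sketch in Python that computes sample variance and a Welch two-sample t statistic from scratch (the exam-score data are invented for illustration):

```python
# Worked sketch of the two key concepts: measuring variability
# (sample variance / standard deviation) and using it in a
# two-sample t-test. The exam-score data are invented.
import math

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    """Sample variance (divides by n - 1)."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def t_statistic(a, b):
    """Welch's two-sample t statistic: group difference divided by
    the standard error built from each group's variance."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

homework = [72, 85, 78, 90, 81]   # exam scores with online homework
control  = [65, 70, 74, 68, 71]   # exam scores without

print(round(t_statistic(homework, control), 2))  # 3.41
```

A large t statistic relative to its reference distribution is what licenses the conclusion that the group difference is unlikely to be due to chance alone.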
Examples of Content that can be used in this Batch of MOOClets:
Content from OLI Statistics course, Khan Academy (and other freely licensed online material) that is relevant to High School, Community College, and Intro University syllabi/standards. [This can be easily implemented as text lessons and exercises in OpenEdX]
Statistics exercises & worked-examples of the kind used on Khan Academy [I have already implemented these in Qualtrics]
A module from Cerego that uses adaptive learning to review foundational statistics terms & concepts (this has been embedded into EdX using LTI).
This content is not intended to be utterly new, but to synthesize the best of what exists in a modular form that can then be further improved through iterative experimentation. These are MOOClets specifically designed to match the criteria discussed above under "Why focus on 'MOOClets' rather than MOOCs?", such as: being in the right form to be directly used by high school, community college, and Stanford University instructors; being implemented in technology that makes it easy for us to run randomized experiments and collect data; and being easy to update and improve, incorporating rich pedagogy not currently easily available in a MOOC, like adaptive learning and targeted feedback.
What are simple, broadly applicable, and evidence-based principles to apply in creating modular online resources? Although we are interested in generally applicable principles, we will focus on an example of a lesson and interactive exercise that explain and help people learn the logic behind statistical hypothesis tests. This is the target topic for putting a MOOClet up on an actual MOOC platform, so the ideas pulled together here will influence this final product.
The relevant principles likely overlap significantly with instructional design principles for creating good online courses and classes, and with good pedagogy for teaching. We'll be drawing on all of these and considering them in the context of MOOClets – specifically, an online video/text lesson teaching people how to do a statistical test, and/or online exercises to practice and cement this knowledge.
What are examples of experiments that have been embedded in online educational resources? To conduct an experiment in an online context, what kinds of technical information and best practices are useful? This meeting will pull together people's collective knowledge in considering answers to these questions.
Joseph will also present for feedback (and potential collaboration, if you're interested) a recent draft of a paper on experiments in online environments, along with in-progress results from a collaboration that ran an experiment in a MOOC.
Potentially Relevant Reading/Resources
www.josephjaywilliams.com/experiments-online-education
Williams, J.J. & Williams, B. A. (2013). Using Interventions to Improve Online Learning. Paper presented at the Data Driven Education Workshop at the Conference on Neural Information Processing Systems.
Updated Version: Using Randomized Experiments as a Methodological and Conceptual Tool for improving the Design of Online Learning Environments
Williams, J. J. (2014). How online educational resources provide novel affordances for conducting practical interventions and doing psychology experiments. Stanford Psychological Interventions in Educational Settings (PIES) group, Stanford, CA. [Slides]
Williams, J.J. & Williams, B.A. (2014). Online A/B Tests & Experiments: A Practical But Scientifically Informed Introduction. Course presented at ACM CHI Conference on Human Factors in Computing Systems. Toronto, Canada. [Materials]
Williams, J. J. (2013). Enhancing Educational Research & Practice using Experiments on Online Educational Resources. Talk at Pittsburgh Science of Learning Center LearnLab Summer School, Pittsburgh, PA. [Slides]
Previously we have considered good instructional principles for creating MOOClets from scratch. A related issue is to consider how the modular structure of a MOOClet that has already been created – like an exercise, lesson, video – makes it easier to articulate and test pedagogical principles that can be broadly applied to MOOClets that are similar in the ways that learning from them can be improved, but different in many other features.
For example, considering a given "exercise" as a MOOClet (modular component of a MOOC), learning could be improved by adding motivational messages or prompts with general questions to reflect on (e.g. tiny.cc/whatwhyhow, tiny.cc/whatwhyhow2 on Khan Academy), even if these exercises cover many different topics – algebra, statistics, negotiation strategies, product design.
Similarly, considering a video or lesson as a MOOClet (whether in a MOOC or on-campus blended course; whether a resource for Stanford undergraduates or community college or high school students), learning could be improved by instructions for students to reflect on relevant previous topics (accessing prior knowledge), quizzes during the video, or having students attempt to recall the lesson or explain it to someone else afterwards.
While the exact size of the benefit or nature of the activity might vary – and be customized based on expert knowledge of the specific learning context – the idea is that the modular structure of MOOClets makes it easier to identify such broadly applicable principles when compared to a full course.
An added advantage of focusing on such instructional principles directed at MOOClets is that, as modular components of educational contexts, these are applicable to many contexts. For example, motivational messages and metacognitive prompts to explain in a Khan Academy exercise can be extended (even if requiring adaptation) to the hundreds of exercises on their site that use an identical template. Moreover, on the face of it, these principles may apply far more broadly, since the dynamics of a Khan Academy exercise (see e.g. tiny.cc/whatwhyhow2) align reasonably closely with exercises in a variety of MOOCs, with exercises in intelligent tutoring systems, and even with classroom exercises and homework problems in completely "offline" settings.
We will discuss when and how MOOClets can be designed to support these kinds of broadly applicable pedagogical principles, and what kinds of principles are supported.
This topic is relevant to a Chapter on "Exploring Cognitive Gains" in the iNACOL Handbook of K-12 Blended & Online Learning (feel free to request access, I am happy for suggestions and co-authors).
Relevant Reading could be this paper, which focuses on the MOOClets of a single video and exercise, and draws on research on how prompting people to answer questions and explain in order to identify potential broadly applicable pedagogical principles of asking questions before, during, and after a video/exercise. Demos are shown at tiny.cc/augmentedvideo and tiny.cc/augmentedexercise.
Williams, J.J. (2013). Applying Cognitive Science to Online Learning. Paper presented at the Data Driven Education Workshop at the Conference on Neural Information Processing Systems.
Taking last week's theme – that modular components make it easier to articulate broadly applicable instructional principles – a step further, we will discuss how general metacognitive strategies can be promoted with relatively straightforward and readily scalable strategies, like prompting people to rate their confidence in the correctness of answers they give.
Judy Kay (visiting from the University of Sydney) will talk about "Scaffolding metacognition, starting small but with confidence" and how this could be easily integrated into MOOC platforms' current exercises. An abstract and relevant paper to read for background is below:
It is becoming increasingly easy for teachers to create many forms of digital learning resources, such as web-based materials, quizzes and games. Extensive research in metacognition provides strong evidence that metacognitive interface elements could make such materials more effective. This discussion will explore a proposal for a modest start towards bringing metacognitive elements to broader use: replacing the regular submit button with an interface element that enables the learner to report their confidence in the task just done. We will share results of previous work that can inform the ways that learners, teachers and course developers gain benefits, then move on to the next steps and links to MOOClets.
Talk: Mykola Pechenizkiy of Eindhoven on "From A/B testing to personalization with uplift classifiers"
Abstract:
MOOC- and ITS-powered education provides an excellent opportunity for conducting online controlled experiments, or A/B testing. Such experimentation allows one to find out whether a particular intervention or teaching approach is more effective (on average) than another in reaching some desired outcome, e.g. maximizing the effectiveness of feedback to students.
However, in many cases a particular intervention may be beneficial for some students but have no effect or have even a negative effect on the performance of other students.
In this talk I will describe how predictive modeling can be used to analyze the results of A/B testing to induce uplift classifiers that can help us choose which intervention is the most appropriate in which situation.
We formulate three supervised learning approaches to select an appropriate intervention at an individual level.
We emphasize that not all instances (students) are equally sensitive to this choice. Accurate choice of an action is essential for those instances which are sensitive to it, and we focus the supervised learning process on such cases. The potential of the underlying ideas is demonstrated with synthetic examples and a case study with real datasets.
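As a rough illustration of the two-model flavor of uplift modeling the abstract describes (not Pechenizkiy's actual method or data; the stratifying feature and outcomes below are invented), one can estimate the outcome separately in the treated and control arms of an A/B test, stratified by a student feature, and intervene only where the estimated uplift is positive:

```python
# Pure-Python sketch of a "two-model" uplift estimate: compare pass
# rates between treated and control arms within each stratum of a
# student feature, then recommend the intervention only for strata
# with positive estimated uplift. The feature and data are invented.

# (low_prior, treated, passed) triples from a hypothetical A/B test
data = [
    (True,  True,  True), (True,  True,  True), (True,  True,  False),
    (True,  False, False), (True,  False, False), (True,  False, True),
    (False, True,  False), (False, True,  True),
    (False, False, True), (False, False, True),
]

def pass_rate(low_prior, treated):
    ys = [p for lp, t, p in data if lp == low_prior and t == treated]
    return sum(ys) / len(ys)

def uplift(low_prior):
    """Estimated effect of the intervention for this stratum."""
    return pass_rate(low_prior, True) - pass_rate(low_prior, False)

print(uplift(True))   # positive: intervene for low-prior students
print(uplift(False))  # negative here: don't intervene for the rest
```

Replacing the stratified averages with fitted classifiers per arm gives the supervised-learning version at the individual level.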
Bio:
Mykola Pechenizkiy (PhD) is Assistant Professor in Information Systems at the Department of Computer Science, Eindhoven University of Technology, the Netherlands. He has broad expertise and research interests in knowledge discovery, data mining and predictive analytics, and particularly in their application to various real world problems in industry, commerce, medicine and education. He develops generic frameworks and effective approaches for designing adaptive, context-aware predictive analytics systems dealing with evolving data. In recent years, Mykola has been active in the area of Educational Data Mining (EDM); he co-edited the first Handbook of EDM, co-organized several events including EDM 2011 and LASI 2014, and served as a guest editor of special issues, including the special issue on EDM in ACM SIGKDD Explorations.
Potential Reading:
Zliobaite, I., & Pechenizkiy, M. (2013). Predictive User Modeling with Actionable Attributes. arXiv preprint arXiv:1312.6558.
Topic: Using Modules to bring together A/B Testing, Assessment, Personalization & Adaptive Learning Systems.
A system that allows authoring & modification of different versions of online education modules or MOOClets (crowdsourcing & collaborative development) to be presented randomly (A/B testing) or conditional on features of learners (Personalization) or other content characteristics (Adaptive Learning) deploys modularity to great advantage.
For example, consider a collection of mathematics exercises (like that on www.khanacademy.org). The exact same infrastructure that allows randomized assignment & A/B testing of different versions of exercises can be used to allow conditional logic – showing different versions of an exercise based on measurements of a student's attitudes and learning strategies, a learner's current state of knowledge, or features of content and what has occurred before.
At the same time, the capacity that allows representing multiple versions of a resource (for random assignment or adaptation) also leverages the power of synergistic or adversarial collaboration in creating new resources or improving existing ones, because multiple versions of an exercise or lesson can be created and tested against each other.
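The point that one infrastructure serves both random and conditional assignment can be sketched as follows (version names, learner features, and the policy interface are all hypothetical):

```python
# Sketch of shared infrastructure for multiple versions of a MOOClet:
# the same version store serves random assignment (A/B testing) and
# conditional assignment (personalization / adaptive learning).
# Version names, learner features, and the policy flag are hypothetical.
import random

versions = {"hints": "exercise_with_hints", "plain": "exercise_plain"}

def choose_version(learner, policy="random"):
    if policy == "random":                     # A/B test: uniform assignment
        return random.choice(list(versions.values()))
    if policy == "personalize":                # conditional on learner state
        key = "hints" if learner.get("struggling") else "plain"
        return versions[key]
    raise ValueError(f"unknown policy: {policy}")
```

Swapping the policy from "random" to "personalize" changes nothing about how versions are stored, authored, or logged, which is the advantage modularity buys.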
Modules that teach students general learning strategies or beliefs that help motivate them have several valuable features and offer particular opportunities for linking research on metacognition and motivation with powerful practical benefits for students.
Practical Benefits
From a cost-benefit analysis perspective, the same time and energy that could be used to teach some specific content (e.g. a geometry principle) might have a far larger effect if it taught a general skill that was then applied to many instances of specific content – for example, if teaching a strategy for studying worked examples of math problems in order to identify principles increased learning from many subsequently studied problems.
Moreover, MOOClets that teach generally applicable skills can then scale broadly because they can be inserted in a large number of courses of different kinds.
The main obstacle is whether such general strategies are in fact teachable through relatively limited exposure. There are many empirical findings revealing the difficulty of, and failures in, teaching general strategies, particularly divorced from specific content.
On the other hand, a very broad review of literature across different disciplines does reveal examples of successfully teaching domain-general beliefs and strategies that have a lasting effect, which we can discuss in more depth. For example, see "WISE" or High Impact from Brief Exposure Interventions in social psychology, and programs for teaching comprehension strategies like Reciprocal Teaching.
Research Opportunities
Even if randomized experiments to present (different versions of) MOOClets teaching general skills were to reveal relatively small effects, they have the distinctive advantage that these effects can be manifested across a wide range of measures of different learning, and that such effects can continue as a learner moves through materials of many different kinds.
Experimental manipulation of cognitive (rather than metacognitive or motivational) factors like prompting people to explain a concept are expected to impact some very specific knowledge and restricted (although usually carefully calibrated) set of measures of learning.
Experimental manipulation of different versions of MOOClets teaching general skills might actually be better suited to research in less controlled environments and where researchers have less sway over which measures of learning are collected. The first reason for this is that such manipulations can impact variables that reflect learning even when these are not tailored to detecting such effects (e.g. accuracy in solving problems, performance on quizzes).
The second reason is that such manipulations can have effects which are pooled across the course of multiple lessons and throughout a course. For example, even if providing lessons on metacognitive skills had a tiny influence on students' success on a particular problem, such effects could be detected in a month's assignments or the final grades in a course.
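A quick simulation illustrates this pooling argument: a per-problem effect too small to notice on any single item shows up clearly in totals aggregated over many assignments (the effect size and counts below are invented for illustration):

```python
# Simulation of the pooling argument: a tiny per-problem boost
# (60% -> 62% chance of a correct answer) is invisible on any one
# problem but emerges in totals over a month of assignments.
# Effect size and sample counts are invented.
import random

random.seed(0)
N_STUDENTS, N_PROBLEMS = 200, 100
P_CONTROL, P_TREATED = 0.60, 0.62   # tiny per-problem effect

def total_correct(p):
    """Number of correct answers over a month of problems."""
    return sum(random.random() < p for _ in range(N_PROBLEMS))

control = [total_correct(P_CONTROL) for _ in range(N_STUDENTS)]
treated = [total_correct(P_TREATED) for _ in range(N_STUDENTS)]

diff = sum(treated) / N_STUDENTS - sum(control) / N_STUDENTS
print(diff)  # roughly 2 extra problems correct per student over the month
```

The same logic applies to final grades: effects that are negligible per item can still be reliably detected once outcomes are pooled across a course.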
Talk by Inga Glogger on "Preparing learners to make the most of Online Learning & MOOCs: training and facilitating self-explanation strategies"
From 12-1 Inga Glogger will present a study design aiming to test the transfer effects of a short strategy-training on learning from a MOOC lesson, developed in collaboration with Joseph Jay Williams.
Abstract
When people learn with a learning environment, it is important that they process the given information actively and focus on core information. Especially when learning with MOOCs, where there is little external or social support and many distractions, learners need strategies to focus on and process actively the principles to be learned. Training learners to use such strategies typically requires some facilitation to overcome a phase of enhanced strategy application without improving learning outcomes. The study aims to test the (transfer) effects of a short strategy-training together with such a “procedural” facilitation on learning from MOOCs. The conditions will be a training, a training plus facilitation, and a control condition. The training will focus on principle-based self-explanation and high-quality use of this strategy. The strategy facilitation will provide support for the increasingly self-regulated use of the strategy during learning in a MOOC. Strategy application as well as learning from the MOOC’s contents will be assessed. The study will contribute to the development of training procedures preparing learners to profit more from future online learning.
Bio
Inga Glogger is a postdoctoral researcher and lecturer at the Department of Educational and Developmental Psychology (with Prof. Alexander Renkl) at the University of Freiburg in Germany. Her research focuses on training and assessing process-oriented learning strategies (e.g., by learning journals) and instructional design that aims to educate teachers in assessing learning strategies.
She is also interested in instructional methods that prepare students to learn (prior knowledge is largely missing), such as inventing with contrasting cases or self-explaining worked examples, and that attend to special prior knowledge (prior knowledge is fragmented or incorrect).
Potential Reading
J. Kay, S. Kleitman, and R. Azevedo. Chapter 11. Empowering teachers to design learning resources with metacognitive interface elements. In R. Luckin, J. Underwood, N. Winters, P. Goodyear, B. Grabowski, and S. Puntambeker, editors, Handbook of Design in Educational Technology, 124-134. Taylor and Francis, 2013. [PDF]
Background Research Literature on Motivation & Mindset
We will consider some of the empirical research that has documented substantial benefits towards hard-to-change real-world outcomes like grades (see Mindset: Teach a growth mindset of intelligence to boost motivation and learning, and "WISE" or High Impact from Brief Exposure Interventions) to outline principles for developing MOOClets to increase motivation – Introductory Lectures, Email Announcements, Lessons, Messages, Forms of Feedback, Instructor Guides. We will also examine several existing examples of these resources, and pool knowledge to provide feedback on these or to create new ones.
Examples of Resources: Lessons, Instructor Guides, Digital Interactive Cognitive Aids
Examples of these resources are available in tiny.cc/moocletsnotes to people in the MOOClets Group (tiny.cc/joinmooclets), such as Lessons (Animated 3-4 minute Videos explaining that intelligence is malleable, a video by a MOOC instructor explicitly explaining what a Growth Mindset is, blog post, lesson & activities augmented with Cognitive Support to promote learning of a Growth Mindset); Interactive Digital Coaches to reshape attributions; and a (preliminary) Instructor Worksheet for fostering Growth Mindset
Practical Context: Course(s) on Logic
We will have guests Dave Barker-Plummer and Su Su, since the practical context we will focus on will be Dave Barker-Plummer, Jon Barwise, John Etchemendy and colleagues' upcoming MOOC on Logic, which is based on their Language, Proof and Logic text. Some of this is pretty advanced material, but part of it forms the basis for many introductory courses in Logic (in Philosophy, Mathematics, Computer Science, and general undergraduate breadth requirements); you can see the Table of Contents or Textbook. It covers topics like variables, conditionals, and truth tables. A related course is taught by Keith Devlin as Introduction to Mathematical Thinking, and there is an Introduction to Logic MOOC on Coursera.
Logic is a nice topic to think about because it's also generally applicable to anyone and any domain in terms of critical thinking skills and general reasoning, and "logical reasoning" is an extensive topic of study in cognitive psychology. On the motivational side, it's an area about which students may hold beliefs about not being good at math or computer science or hard sciences.
Incorporating Assessments (e.g., of Learning, Characteristics of Learners) into MOOClets, and exploring how to make modules that function as assessments & quizzes.
Using MOOClets for doing Machine Learning Research
Applying & Illustrating Instructional Design processes using MOOClets
Synthesizing & Leveraging Scientific Research with Practical Implications
[Ethnographic Methods]
Scientifically Informed Benchmarking
Identifying Relevant Educational Products well suited to conducting research.
Creating resources that also influence Macro-Level Real-World outcomes that Education Policy Makers & Economics Researchers care about and investigate (e.g., completion/dropout rates, level of education, employment, salary)
Doing research with Scientific & Financial Value: Identifying the contexts in which scientific knowledge has financial value. Where do the interest of entrepreneurs and for-profit businesses align with those of scientific researchers, and where do they diverge?
Treating MOOClet development as a Product Design problem and applying “D-School Design Thinking”
Week 6: Crowdsourcing between Experts – facilitating Collaborative Research and Diverse Sources of Expertise in making Practical Improvements
You must use a Google Account that is a member of the mailing list and Google Group for MOOClets (mooclet@googlegroup.com) to access any of the Google Documents embedded in this page. You can join or request access for the Google Account you are signed in with now at tiny.cc/joinmooclets.
You must use a Google Account that is a member of the mailing list and Google Group for MOOClets (mooclet@googlegroup.com) to see the announcements below. You can join or request access for the Google Account you are signed in with now at tiny.cc/joinmooclets.