From Interactive Simulation to Scorable Activity: How I Upgraded My Condensation Lab on SLS

One of the things I enjoy most about teaching with the Singapore Student Learning Space (SLS) is how flexible the platform is. You can start with a simple interactive simulation and — with a bit of pedagogical thinking — transform it into a fully scorable activity that gives both students and teachers meaningful feedback.

That’s exactly what I did with my Condensation Interactive Simulation module, and in this post I want to walk through the design process.


The Starting Point: A Pure Interactive Simulation

The journey began with two interactive simulations embedded in the “B. Water Cycle Interactive” section of my module.

Simulation 1 explored how temperature differences between water and its surroundings determine whether condensation forms on the inner or outer surfaces of a beaker. Students used sliders to adjust the surrounding air temperature and the water temperature, then observed where — and whether — water droplets formed.

https://vle.learning.moe.edu.sg/community-gallery/module/preview/0a8a4d0c-a071-42eb-8bcc-46b825d950e6/section/101831311/activity/101831313
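The relationship students explore in Simulation 1 can be sketched as a toy decision rule. This is my own simplification for illustration, not the simulation's actual model, and the dew-point value is an assumed constant:

```python
# Toy decision rule for Simulation 1 (my own simplification, not the
# SLS simulation's actual model). Assumption: droplets form on whichever
# glass surface is colder than the moist air touching it.

DEW_POINT_C = 20.0  # assumed dew point of the room air


def condensation_surface(water_temp_c: float, air_temp_c: float) -> str:
    """Predict where droplets form on a beaker of water."""
    if water_temp_c < DEW_POINT_C:
        # Cold water chills the glass below the room air's dew point,
        # so moisture from the surrounding air condenses on the outer surface.
        return "outer"
    if water_temp_c > air_temp_c:
        # Warm water evaporates inside the beaker; the vapour meets the
        # cooler glass above the waterline and condenses on the inner surface.
        return "inner"
    return "none"


print(condensation_surface(water_temp_c=5, air_temp_c=28))   # cold drink, warm room
print(condensation_surface(water_temp_c=60, air_temp_c=25))  # warm water, cool room
```

Sweeping the two sliders in the simulation is, in effect, exploring this decision boundary empirically.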


Simulation 2 — which I added and customised — presented a different scenario: a tilted steel plate placed above a beaker of boiling water. Students could adjust the plate’s temperature and observe how quickly droplets formed and dripped into a collection dish below.

https://vle.learning.moe.edu.sg/community-gallery/module/preview/0a8a4d0c-a071-42eb-8bcc-46b825d950e6/section/101831311/activity/101831312


The driving question for this simulation was:

How does the temperature of a surface affect the rate of condensation?
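One way to picture the expected answer is a toy rate model: the further the plate sits below the dew point of the steam-laden air, the faster condensate accumulates. The dew point and rate constant below are invented for illustration; the simulation's actual physics are not published:

```python
# Toy linear rate model for Simulation 2 (my own illustration, not the
# SLS simulation's actual physics). Assumptions: a fixed dew point for the
# steam-laden air above the beaker, and a rate proportional to how far the
# plate temperature sits below that dew point.

DEW_POINT_C = 80.0   # assumed dew point just above the boiling water
RATE_PER_DEG = 0.05  # assumed mL of condensate per degree C of undercooling, per minute


def condensation_rate(plate_temp_c: float) -> float:
    """Approximate condensation rate (mL/min) on the tilted steel plate."""
    return max(0.0, RATE_PER_DEG * (DEW_POINT_C - plate_temp_c))


for t in (10, 40, 70, 90):
    print(f"plate at {t:>2} degC -> {condensation_rate(t):.2f} mL/min")
```

A colder plate gives a faster drip into the collection dish, and at or above the assumed dew point no droplets form at all, which mirrors what students see as they push the plate-temperature slider up.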

Both simulations were engaging on their own. Students could freely explore, manipulate variables, and observe the results.

But I ran into a familiar challenge: there was no accountability. I had no way of knowing whether students were genuinely making sense of what they were seeing — or simply sliding the controls around and moving on.


The Upgrade: Wrapping the Interactive in a Scorable Shell

This is where the real design work began.

Instead of replacing the interactive, I kept it front and centre and built a scorable question set around it. In effect, the free-exploration activity became a guided, assessable investigation.

Here’s how I structured each upgraded activity.


Step 1: Anchor the Exploration with Structured Instructions

At the top of the activity, I added clear step-by-step instructions. These didn’t replace exploration — they simply gave it direction.

https://vle.learning.moe.edu.sg/community-gallery/module/preview/0a8a4d0c-a071-42eb-8bcc-46b825d950e6/section/101831311/activity/102033826



For the steel plate simulation, students were guided to adjust the plate's temperature, observe how quickly droplets formed, and note how much water dripped into the collection dish below.

This simple framing shifts the activity from “playing with sliders” to conducting a small investigation.


Step 2: Add an Open-Ended Observation Question (Q1)

The first question in each scorable activity was an open-ended response worth 4–5 marks.

Students had to capture or describe what they observed in the simulation.

This step was important because it required students to articulate their observations in their own words, rather than simply sliding the controls around and moving on.


Step 3: Layer in Conceptual Multiple-Choice Questions

After the exploration phase, I added several conceptual multiple-choice questions, each with a 30-second timer.

These questions connected directly to the simulation experience, for example asking how the temperature of a surface affects how quickly droplets form on it.

I also included a multiple-select question to encourage higher-order thinking. Instead of identifying a single correct answer, students had to evaluate several statements about energy transfer and select all that were correct.

For the beaker simulation, the questions similarly tested understanding of how the temperature difference between the water and its surroundings determines whether condensation forms on the inner or outer surface.


Step 4: Assign Marks and Add Light Time Pressure

Each question was assigned a mark value (typically 1–5 marks). The multiple-choice questions were given 30-second timers.

This small design choice helps discourage students from searching for answers online and encourages them to rely on their observations from the simulation.


Why This Approach Works

The key insight is simple: the simulation itself didn't need to change — only its context did.

By wrapping the same interactive in structured instructions and a question set, the activity shifted from passive exploration to purposeful inquiry with accountability.

Students still get the satisfaction of manipulating variables and observing real-time outcomes. But now they also have to interpret what they see.

At the same time, teachers gain something valuable: evidence of understanding.

SLS makes this workflow surprisingly smooth. Question types such as Free Response, Multiple Choice (Single), and Multiple Choice (Multiple) can all sit alongside the embedded simulation on the same activity page.

The Learning Analytics panel also logs simulation interactions, allowing teachers to see timestamped events such as when a student started the simulation and how they interacted with it.


Final Thoughts

If you already have interactive simulations in your SLS modules that students enjoy but aren’t being assessed on, consider trying this approach.

You don’t need to rebuild anything from scratch. Simply:

  1. Add structured step-by-step instructions to guide exploration

  2. Include an open-ended observation question

  3. Follow up with 3–5 conceptual multiple-choice questions

  4. Assign marks and consider adding time limits

The result is a richer and more purposeful learning experience — and a much clearer picture of where your students actually stand.

This activity is part of my Condensation Interactive Simulation module in the SLS Community Gallery. Feel free to explore and adapt it for your own classes.
