I have revised this article as 'Faster, better, cheaper through reusable learning objects'.
My background is in human performance. I studied biology, psychology, sociology and a lot of sports science and PE, so I'm fascinated with finding out what our bodies are capable of. It's amazing how much is known and has been documented. What I see as lacking is a model that ties all this knowledge into virtual systems.
I feel software, and our abilities with it, have advanced sufficiently to make online simulations and models in which you can test ideas and investigate complex processes within the body.
The problem is I can't find any open source work that's done even the basics. People just don't seem to think about it, or don't believe it's possible. I want to find out which processes have already been replicated: whether the decay of vitamins and minerals has been modelled, metabolism, things like that. I've heard of lots of useful things that have been modelled that I'd like to use. I want to look at how life works as it exists right now, human life at first.
The project I'm focusing on at first is something very simple and easy to deliver, something that makes the point that this stuff can be modelled and should be made interactive. I did a course at the OU that had tables showing how quickly vitamins and minerals degrade at certain temperatures and under other conditions. From this you can work out how quickly these vitamins degrade in the food you buy, and thus figure out the best ways to store and cook it to get the most benefit.
That's great, but none of it is digital data that I can query, so it requires a whole lot of work just to figure out what would happen for one item of food. Yet all this stuff is known and pretty standardised. So I want to model this for just one fruit, probably an apple, at first using web technologies.
Phase 1:
Create a simple reusable learning object where you see the apple from the point it's picked to the point it's eaten. At the same time you see the nutrient values at that point in time, so you watch the nutrient values change over time and get an idea of the best time to eat your fruit. You'll be able to pick any point on the timeline to see the values at that point.
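To give a feel for how little is needed, here's a minimal sketch in plain JavaScript of what I mean by Phase 1. The nutrient names, values and daily loss rates are placeholders I've made up for illustration; the real thing would pull its figures from the published degradation tables.

```javascript
// Minimal sketch of the Phase 1 idea: nutrient values along a timeline
// from picking to eating. All figures below are illustrative placeholders.

// Nutrient content of a freshly picked apple (mg per 100 g) and an assumed
// fraction lost per day in storage.
const nutrients = [
  { name: "Vitamin C", initial: 4.6, dailyLoss: 0.05 },  // loses value quickly
  { name: "Potassium", initial: 107, dailyLoss: 0.001 }, // minerals barely change
];

// Value of one nutrient after a given number of days in storage.
function valueAtDay(nutrient, days) {
  return nutrient.initial * Math.pow(1 - nutrient.dailyLoss, days);
}

// Report all nutrient values at a chosen point on the timeline.
function report(days) {
  return nutrients
    .map(n => `${n.name}: ${valueAtDay(n, days).toFixed(2)} mg per 100 g`)
    .join(", ");
}

// In the browser this would be driven by a slider along the timeline;
// here we just print a few points between picking and eating.
[0, 7, 14, 28].forEach(days => console.log(`Day ${days}: ${report(days)}`));
```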
Phase 2:
This is where it gets more interactive. Nutrients are affected differently by temperature, so I'd add the option to select various temperatures at points along the timeline, showing the effect on the nutrients of storing the apple at different temperatures and also of cooking it.
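Again just as a sketch, the temperature option could be modelled by scaling the decay rate with storage temperature. The "rate roughly doubles per 10 °C" rule of thumb and every number below are assumptions for illustration, not measured values; the real model would use the published data.

```javascript
// Sketch of the Phase 2 idea: the decay rate depends on storage temperature,
// and the timeline is walked segment by segment (counter, fridge, cooking).
// All constants are illustrative assumptions.

const kReference = 0.05;     // assumed Vitamin C decay constant per day at 20 °C
const referenceTempC = 20;

// Temperature-adjusted decay constant: roughly doubles per 10 °C rise.
function decayConstantAt(tempC) {
  return kReference * Math.pow(2, (tempC - referenceTempC) / 10);
}

// Fraction of the nutrient remaining after a sequence of storage segments,
// each with its own duration (days) and temperature.
function remainingFraction(segments) {
  let fraction = 1;
  for (const { days, tempC } of segments) {
    fraction *= Math.exp(-decayConstantAt(tempC) * days);
  }
  return fraction;
}

const timeline = [
  { days: 3, tempC: 20 },     // a few days on the counter
  { days: 10, tempC: 4 },     // then in the fridge
  { days: 0.01, tempC: 100 }, // then briefly cooked (~15 minutes)
];

console.log(`Vitamin C remaining: ${(remainingFraction(timeline) * 100).toFixed(1)}%`);
```

Hooking this into the Phase 1 timeline would just mean recomputing the displayed values whenever the temperature on a segment changes.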
I've got many ideas about how to take this forward, but I'm just focusing on these two phases for now. I believe it's now possible to achieve using just HTML and JavaScript, which means it can run in any browser that understands these technologies. I also want to slowly tackle basic problems such as these so that, over time, I've built up enough libraries to begin refactoring them into a more cohesive set of code describing processes in the human body.
The key aspect for me is that this should be an open source project, so that any contribution made to it is open to others: e.g. if alife contributes to it, alife gets the reference. I want it open source to encourage collaboration. I don't think I can do it all myself. I just want to spearhead this and spark interest in others, much like alife is doing.
I've talked a lot more about my general ideas on my blog. It's a real first-draft article at present, something I'm building over time.
I believe this would be an innovative way of involving people in the lessons they're learning. I grew up interacting with everything, particularly in computer games, and I learn so much more that way. Current methods of learning are mainly about lecturing, not engaging and collaborating. So I see this as a way of bringing experimentation and personal discovery back into learning, so that we can use our full human potential in the virtual classroom: look at things from different angles and ask questions directly rather than through a teacher. The teacher becomes much more of a facilitator, someone who can ask us questions that stimulate us to look at the example in different ways and discover new things.
I think if we did that we'd find that our bodies aren't so difficult to understand, and that by understanding and applying just a few concepts we'd eradicate the majority of major diseases such as heart disease and osteoporosis.
There we have it. I've put this idea out there. I'm very excited by it; I've had it in my head for quite a while now. Glad to have it down in print.
I've also created a simple stockwatcher example of what's currently possible, and I documented the development process to show how quickly it was achieved.
edit 20100327
It just occurred to me another reason why this approach excites me so much. It enables education to make the next leap in shared educational resources: a market, or app store. Yes, I know it's the term du jour, but the point is not the name, it's what it brings: all resources in one place under common terms of use. Instead of each teacher or course team having to scour the web for resources and content, and having to find terms of use that suit them, they can instead search one place, knowing that each app is designed to common standards. I don't mean they all use the same technology; I mean that each is deliverable over the web and can be either reached directly or embedded as part of a course or suite of tools. In effect, it can be white labelled.
To be listed in the app store an app must meet minimum terms such as privacy controls, accessibility standards, browser support and cost. Meeting every standard, such as being free or hitting the highest accessibility level, isn't required, but each app will be rated against these standards to enable easy search and classification, making it easier for those looking for apps to find what they need according to their own criteria.
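To make that concrete, here's a hypothetical sketch of what a listing and a search against those criteria could look like in JavaScript. None of the field names, ratings or the URL are a real standard; they're placeholders to show the idea.

```javascript
// Hypothetical sketch of an app-store listing and a search over it.
// Field names, rating values and the URL are illustrative assumptions.

const listings = [
  {
    name: "Apple nutrient timeline",
    url: "https://example.org/apple-nutrients", // placeholder URL
    cost: 0,                                    // free to use
    privacy: "no personal data collected",
    accessibility: "AA",                        // accessibility level claimed
    browsers: ["any modern HTML/JavaScript browser"],
    embeddable: true,                           // can be white labelled into a course
  },
];

// Teachers search one place by their own criteria instead of scouring the web.
function findApps(apps, criteria) {
  return apps.filter(app =>
    (criteria.maxCost === undefined || app.cost <= criteria.maxCost) &&
    (criteria.accessibility === undefined || app.accessibility === criteria.accessibility) &&
    (criteria.embeddable === undefined || app.embeddable === criteria.embeddable)
  );
}

console.log(findApps(listings, { maxCost: 0, embeddable: true }));
```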