
Should the site reward completed projects? #204

Open · dalelane opened this issue Jul 11, 2019 · 6 comments

@dalelane (Member)

Digital badges are an increasingly popular way in the business world to recognize time that people invest in education. Would it be useful to support something like this for students with Machine Learning for Kids?

There are some questions that would need answering around how and when the site determines that a project has been completed. E.g. I can imagine setting some basic thresholds: a minimum amount of training data, confirming that an ML model was trained, verifying that it was used (at least once?) from Scratch and/or Python.
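
A minimal sketch of what that kind of completion check could look like. The `ProjectRecord` shape and the threshold value are purely hypothetical, not taken from the actual codebase:

```ts
// Hypothetical project summary - not the real ML4K data model
interface ProjectRecord {
    trainingExampleCount: number;   // total examples across all labels
    modelTrained: boolean;          // a model was successfully trained
    classifierCallCount: number;    // times the model was used from Scratch/Python
}

const MIN_TRAINING_EXAMPLES = 10;   // illustrative threshold

function isProjectComplete(project: ProjectRecord): boolean {
    return project.trainingExampleCount >= MIN_TRAINING_EXAMPLES &&
           project.modelTrained &&
           project.classifierCallCount >= 1;   // used at least once
}
```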

There are other challenges, too. At the moment, students are anonymous - I have a real-world identity (well, an email address) for teachers. But I don't have any way of identifying a student (and that was intentional/by design). So any approach would need to be compatible with that - e.g. would the teacher be the one to request the badge on behalf of their student?

@dalelane (Member Author)

cc @ajdaniel for his opinions :)

@ajdaniel (Contributor)

I love this idea! A great way to encourage somebody to use your product effectively is to gamify it - for example, achievements! We could have a list of achievements (each with a badge reward) that all kids can see up front and work their way through. It's a fun way to encourage kids to touch all parts of the site, e.g. achievements for training a project, adding a lot of examples to a bucket, completing a certain number of projects, sharing a project, etc.
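
One possible shape for an up-front achievements list, as a sketch - the structure and the example achievements are illustrative only:

```ts
// Illustrative achievement definitions - ids, titles, and badges are made up
interface Achievement {
    id: string;
    title: string;
    description: string;
    badgeImage: string;   // path to the badge artwork
}

const ACHIEVEMENTS: Achievement[] = [
    { id: 'first-model',   title: 'Trainer',   description: 'Train your first ML model',          badgeImage: 'badges/first-model.svg' },
    { id: 'big-bucket',    title: 'Collector', description: 'Add 50 examples to a single bucket', badgeImage: 'badges/big-bucket.svg' },
    { id: 'five-projects', title: 'Builder',   description: 'Complete five projects',             badgeImage: 'badges/five-projects.svg' },
    { id: 'shared',        title: 'Presenter', description: 'Share a project',                    badgeImage: 'badges/shared.svg' },
];
```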

@ajdaniel (Contributor)

The hardest part would be how the achievements get triggered. We could automate a lot of it, but not on the project side - unless we can make some assumptions. For example, because we know an API call is made to the model from Scratch, we can assume there is a project that uses it. If we can track which of the buckets appeared in the responses, we can see whether the student has tried all of the options. Additionally, we could extend the Scratch ML code block to provide more info about the project, so we can track more things (such as the project name, and which other blocks exist).
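
A rough sketch of that bucket-tracking idea, assuming a server-side hook that runs on every classify call (all names here are illustrative):

```ts
// projectId -> set of labels ("buckets") seen in classification responses
const labelsSeen = new Map<string, Set<string>>();

// assumed to be called from the classify endpoint after each response
function recordClassification(projectId: string, topLabel: string): void {
    let seen = labelsSeen.get(projectId);
    if (!seen) {
        seen = new Set<string>();
        labelsSeen.set(projectId, seen);
    }
    seen.add(topLabel);
}

// true once the project has returned every one of its labels at least once
function hasTriedAllBuckets(projectId: string, allLabels: string[]): boolean {
    const seen = labelsSeen.get(projectId);
    return seen !== undefined && allLabels.every((label) => seen.has(label));
}
```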

@dalelane
Copy link
Member Author

Yeah, that should be do-able. The Scratch blocks already include headers to identify themselves (e.g. https://github.com/IBM/taxinomitis/blob/master/resources/scratch3-images-classify.js#L129 ). That was mostly to make it easier to investigate errors - to tell the difference between a bug in my Scratch extension and a mistake in a student's Python code, given that they both call the same APIs - but we could extend this idea to do enough tracking to enable assessing projects.
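
As a sketch of how a caller-identifying header could feed into badge tracking, here's an Express-style handler that records which caller types a project has been used from. `X-ML4K-Caller` is an illustrative header name, not necessarily what the real blocks send:

```ts
import express from 'express';

const app = express();

// projectId -> caller types seen (e.g. 'scratch3', 'python')
const callersSeen = new Map<string, Set<string>>();

app.post('/api/projects/:projectId/classify', (req, res) => {
    // illustrative header name - the real blocks set their own identifier
    const caller = req.header('X-ML4K-Caller') ?? 'unknown';

    const seen = callersSeen.get(req.params.projectId) ?? new Set<string>();
    seen.add(caller);
    callersSeen.set(req.params.projectId, seen);

    // a real handler would run the classification and return its result
    res.json({ recorded: caller });
});

app.listen(3000);
```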

If there's any info we'd want to collect that feels too creepy to collect implicitly/automatically/silently, we could also introduce an explicit submit-project-info-for-badge-assessment block that would need to be included in a Scratch project?
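
Something like this, sketched in the Scratch 3 extension format - the extension id, opcode, and block text are all hypothetical:

```ts
class ML4KBadgesExtension {
    getInfo() {
        return {
            id: 'ml4kbadges',                 // hypothetical extension id
            name: 'ML4K badges',
            blocks: [
                {
                    opcode: 'submitForAssessment',
                    blockType: 'command',
                    text: 'submit this project for badge assessment',
                },
            ],
        };
    }

    submitForAssessment(): void {
        // would explicitly POST project metadata to the ML4K API,
        // so nothing is collected unless the block is added to the project
    }
}
```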

@ajdaniel (Contributor)

Yeah, that's a good idea. Scratch projects are JSON, right? We could do some rudimentary parsing and search for specific blocks in there.

@gabrielcbe (Contributor)

Having a discussion with my boss today, I remembered this issue, so I'm dropping in to maybe help.
I love the idea of gamification as a tool to engage students in their projects.

I haven't used readyai yet, but I saw they have a "passport" and a few badges that students (and teachers) can earn through use of their website and when they make AI projects.

So maybe there could be some achievements for using ML4K itself and not just for the Scratch projects (e.g. training models for text, images, numbers, and sound; re-training a model to improve it; having a whole-class project; etc.).

As for the question about JSON in Scratch projects: .sb2 files (and probably .sb3 too) can be renamed to .zip, and a JSON file can be extracted from them and parsed at will.
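
A sketch of that, assuming the .sb3 layout (a zip archive containing project.json, with each sprite's blocks keyed by id) and using the adm-zip package. The opcode to search for would be whatever the ML4K block registers as - it's illustrative here:

```ts
import AdmZip from 'adm-zip';

// true if any sprite in the .sb3 file uses the given block opcode
function projectUsesOpcode(sb3Path: string, opcode: string): boolean {
    const zip = new AdmZip(sb3Path);
    const project = JSON.parse(zip.readAsText('project.json'));

    for (const target of project.targets ?? []) {
        for (const block of Object.values(target.blocks ?? {})) {
            if ((block as { opcode?: string }).opcode === opcode) {
                return true;
            }
        }
    }
    return false;
}

// e.g. projectUsesOpcode('student-project.sb3', 'mlforkids.classifyImage')
```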

There's a tool called Dr.Scratch that does something similar: it asks for Scratch files (or links) and runs an evaluation of the code in terms of the Computational Thinking competences used in the project (probably by parsing and looking for things in the JSON file).
