I thought this session deserved its own blog entry :) Definitely an interesting topic and something worthy of discussion. At the moment, unless I am convinced otherwise, I plan to use Scratch for my Pre-AP class at SciTech.
Greenfoot
- higher learning curve vs. Scratch and Alice
- can do much more because you are using Java (see the small sketch after this list)
- age: 14+
- scales as large as Java
- only limited by output (2d graphics)
- open source (note: Greenfoot itself is built in Java; it is Scratch that is Smalltalk/Squeak-based)
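Since Greenfoot scenarios are written in plain Java, even a first exercise is real Java code. A minimal sketch of an actor that wanders and turns around at the edge of the world - the class and its behavior are invented for illustration, and it assumes the standard greenfoot.Actor methods act(), move(), turn() and isAtEdge() available in recent Greenfoot versions:

    import greenfoot.Actor;
    import greenfoot.Greenfoot;

    // Hypothetical first Greenfoot exercise: an actor that wanders around
    // the world and turns around whenever it reaches an edge.
    public class Wanderer extends Actor
    {
        public void act()
        {
            move(3);                                       // step forward each frame
            if (isAtEdge()) {
                turn(180);                                 // turn around at the edge
            }
            if (Greenfoot.getRandomNumber(100) < 10) {
                turn(Greenfoot.getRandomNumber(45) - 22);  // occasional random wiggle
            }
        }
    }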
Scratch
- doesn't have classes
- ages 8-15
- over 900 uploaded projects
- concerns about plagiarism from the website
Alice
- originally intended for college
- age: 5th grade
- open source and available for developers!
At the end of it all, each platform has a different target market. I think that Greenfoot is better for high school students in a quarter- or semester-long course. So, YES - I shifted and plan to explore Greenfoot more. I think that my students will find Greenfoot more of a real and challenging programming environment. I wish we had talked about more of the specific differences between the platforms. Looking forward to playing around with Greenfoot soon.
Thursday, March 11, 2010
Comparing Alice, Greenfoot and Scratch
Posted by Tonya G at 3/11/2010 04:47:00 PM. Labels: alice, greenfoot, novice programming, scratch, sigcse, sigcse2010.


SIGCSE - PM Sessions - Concept Inventories, Web Services for CS1
Developing a Validated Assessment of Fundamental CS1 Concepts
Allison Elliott Tew and Mark Guzdial, Georgia Tech (allison@cc.gatech.edu)
- Educators need to be able to measure student learning.
- Related studies: McCracken et al., 2001 (explored students' basic programming abilities using a simple calculator program); Lister et al., 2004 (assessed students' code comprehension and tracing ability using multiple-choice questions about arrays & iteration); Tew et al., 2005 (investigated the impact of different CS1 courses on students' conceptual knowledge at the end of the course, using MCQs)
- Content validity - are the topics a reasonable operationalization of the area?
- Construct validity - are we accurately measuring knowledge of the topic?
- We currently do not have the tools to assess CS1 concepts. Why? We are a young field.
- Method for developing an assessment - define content, expert review of test specification, build test bank of questions, pilot questions, establish validity, establish reliability.
- (From the method above) A step 4 needs to be added: verify language independence
- Analyzing widely adopted textbooks proved unsuccessful: textbooks cover a greater range of topics than what is covered in a typical CS1 course.
- Fundamentals (vars, assignment .... )
- Logical operators
- Selection statements
- Definite loops (for)
- Indefinite loops (while)
- Arrays
- Function/method parameters
- Function/method return values
- Recursion
- Object oriented basics
- 3 types of questions for each concept - definition, tracing, code completion (a hypothetical tracing-style item is sketched below)
- two versions of each question
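To make the question types concrete, here is a hypothetical tracing-style item in Java - my own illustration, not a question from the actual test bank:

    // "What does this program print?"  (a) 4  (b) 6  (c) 10  (d) 0
    public class Trace {
        public static void main(String[] args) {
            int sum = 0;
            for (int i = 1; i <= 4; i++) {
                if (i % 2 == 0) {
                    sum += i;          // only even values of i contribute: 2 + 4
                }
            }
            System.out.println(sum);   // prints 6, so (b)
        }
    }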
A Web Services approach to CS1
Illinois State University
- Statistically significant improvement in final exam performance and course grade; but grading bias and other concerns.
- Hoping to get industry support
- Where do you find these web services? Web services often disappear. You don't have control.
- They are building a library of their own web services.
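My notes don't include their code, but the idea is that even early assignments call a live web service and work with the response. A minimal sketch of such a call in modern Java (java.net.http, Java 11+); the URL is a placeholder, not one of their actual services:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class ServiceCall {
        public static void main(String[] args) throws Exception {
            // Placeholder endpoint; a course would substitute a service from its own library.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://example.edu/cs1/weather?city=Normal"))
                    .GET()
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());

            System.out.println("Status: " + response.statusCode());
            System.out.println("Body: " + response.body());
        }
    }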
Student Misconceptions of Programming
Kaczmarczyk, Petrick, East, Herman (UC San Diego, Northern Iowa, Illinois at Urbana-Champaign)
- focuses on concepts that are important and difficult
- concrete & immediately actionable
- not comprehensive - short!
- 32 important & difficult concepts in CS1 - Goldman et al. in press; Goldman et al. 2008
- Their goal was to confirm (or not) Delphi results
- Identify widely held misconceptions
- Used modified think-aloud interviews, with problems that were both code- and non-code-based
- 89 consistent, reliably repeated misconceptions: memory usage & allocation, array construction, primitive confusion
- Validated (with 2 exceptions) the Delphi results - basic loop operations stump many students, poor to NO conception of objects; NO conception of inheritance
- no conception == no misconceptions
- many of the problems were caused by students applying a real-world concept to the problem: "all of the meats will be in the meats array, and all of the cheese will automatically be in the cheese array." WRONG.
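The primitive-versus-reference confusion is easy to reproduce. A small Java illustration (my own example, not one of the study's interview problems):

    public class AliasDemo {
        public static void main(String[] args) {
            int a = 5;
            int b = a;                    // primitives are copied
            b = 7;
            System.out.println(a);        // 5 - a is unaffected

            int[] xs = {1, 2, 3};
            int[] ys = xs;                // NOT a copy: ys refers to the same array in memory
            ys[0] = 99;
            System.out.println(xs[0]);    // 99 - many novices expect 1 here
        }
    }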
Posted by Tonya G at 3/11/2010 02:44:00 PM. Labels: concept_inventories, novice programming, sigcse, sigcse2010.


Saturday, March 7, 2009
Analyzing Programming Projects
Stuart Hanson, University of Wisconsin - Parkside
"This paper presents the results of student surveys administered after each programming project for multiple sections of two courses: CS2, and Data Structures and Algorithms." They "analyze the data in terms of engagement, frustration and niftiness (from abstract)."
The survey was administered on the due date of each assignment, 4 semesters of data for CS1 and CS2, 3 semesters of data of Data Structures and Algorithms and CS0 data from another institution. Class sizes varied from a low of 6 to a high of 30.
Engagement and frustration
positive correlation in CS2, small negative correlation between engagement and frustration means (the more challenge, the less frustration) and no clear correlation for this difference.
Niftiness = engagement - frustration
- distinguishes nicely among projects with large differences in two values.
- flawed in that it doesn't distinguish among projects where engagement and frustration are approximately equal.
Worst assignments
- all were "borrowed"
- all had instructor related problems
Refining assignments work
"This paper presents the results of student surveys administered after each programming project for multiple sections of two courses: CS2, and Data Structures and Algorithms." They "analyze the data in terms of engagement, frustration and niftiness (from abstract)."
The survey was administered on the due date of each assignment: 4 semesters of data for CS1 and CS2, 3 semesters of data for Data Structures and Algorithms, and CS0 data from another institution. Class sizes varied from a low of 6 to a high of 30.
Engagement and frustration
Positive correlation between engagement and frustration in CS2; a small negative correlation between the engagement and frustration means across projects (more challenge, less frustration); and no clear pattern in the difference between the two.
Niftiness = engagement - frustration
- distinguishes nicely among projects with large differences in two values.
- flawed in that it doesn't distinguish among projects where engagement and frustration are approximately equal.
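To make that flaw concrete (numbers invented for illustration, on a 5-point scale): a project rated engagement 4.5 and frustration 1.5 gets niftiness 4.5 - 1.5 = 3.0, while one rated 1.5 and 4.5 gets -3.0, so the measure separates them nicely; but projects rated 4.5/4.5 and 1.5/1.5 both score 0, even though students experience them very differently.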
Worst assignments
- all were "borrowed"
- all had instructor related problems
- bad data set
- major writing component that the instructor did not adequately discuss.
- assumed (Java I/O) background that students did not have.
Refining assignments works
- the worst assignments were all being used for the first time
- the best assignments had all been used (and refined) before
Friday, March 6, 2009
Reflecting on Programming
Using Programming to Help Students Understand the Value of Diversity
Michael Wick and Paul J Wagner, University of Wisconsin
"At a predominately-white undergraduate university, how can we instill in students an appreciation for the value of diversity and do so in a way that encourages students to seek inclusiveness?"
In my experience, depending on which factors you consider, diversity can be present or nonexistent in computing departments. We are all aware that women, Latino/as and African Americans are severely underrepresented in computing fields, but in terms of cultural diversity some may argue that it can be found in computing (in comparison to other departments). In the case of my department, Americans are a minority. Can we ever (expect to) truly achieve diversity, on all levels?
The intervention was administered and assessed in an introductory C++ programming course. The basic idea is to solve optimization problems using a genetic algorithm. They administered a pre- and post-test to participants.
The project: trivial application - maximize the sum of n numbers using a genetic algorithm.
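A minimal sketch of what such an assignment might look like - written here in Java, although the course described used C++, and with all names, parameters and representation choices being my assumptions: each individual is an array of n numbers, fitness is their sum, and each generation keeps the fitter half of the population and fills the rest with mutated copies.

    import java.util.Arrays;
    import java.util.Comparator;
    import java.util.Random;

    public class SumGA {
        static final int GENES = 10, POP = 20, GENERATIONS = 100, MAX_VALUE = 100;
        static final Random RNG = new Random();

        // Fitness of an individual: the sum of its genes (the quantity to maximize).
        static int fitness(int[] individual) {
            return Arrays.stream(individual).sum();
        }

        public static void main(String[] args) {
            // Random initial population.
            int[][] pop = new int[POP][GENES];
            for (int[] ind : pop)
                for (int g = 0; g < GENES; g++)
                    ind[g] = RNG.nextInt(MAX_VALUE);

            for (int gen = 0; gen < GENERATIONS; gen++) {
                // Selection: sort by fitness, best individuals first.
                Arrays.sort(pop, Comparator.comparingInt(SumGA::fitness).reversed());
                // Replace the weaker half with mutated copies of the stronger half.
                for (int i = POP / 2; i < POP; i++) {
                    pop[i] = pop[i - POP / 2].clone();
                    int g = RNG.nextInt(GENES);
                    pop[i][g] = Math.min(MAX_VALUE - 1, pop[i][g] + 1 + RNG.nextInt(10)); // small mutation
                }
            }
            Arrays.sort(pop, Comparator.comparingInt(SumGA::fitness).reversed());
            System.out.println("Best sum found: " + fitness(pop[0]));
        }
    }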
Ummm...what do you think about teaching genetic algorithms to CS1 students? Don't they already have a steep enough learning curve with programming?
Results
- no control group available
- multitude of uncontrolled factors
- ...
Behaviors of novice programmers
I always find it interesting to learn about novice programming. Where do their misconceptions lie? What is the best approach for teaching novice programmers? This session "reports on an NSF funded research project investigating the development practices of students in introductory programming courses." The authors use an extension of BlueJ to "capture events associated with program development (from abstract)," collecting over 55,000 compilation events from over 110 students. The data is viewed through ClockIt, a web-based interface that presents students' programming behavior as charts, graphs and tables. With ClockIt the instructor (or TA) can view per-student statistics such as the number of compiles, types of errors and how long they worked on the project.
Authors: James B. Fenwick Jr., Cindy Norris, Frank Barry, Josh Rountree, Cole Spicer and Scott Cheek (Appalachian State University).
Frequency of compile errors
- 6 error types accounted for 60% of all errors (similar to other research, e.g. Jadud); illustrated after this list
- missing semicolon (#3)
- unknown variable (#1)
- Bracket expected (#4)
- Illegal start of expression (?)
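For concreteness, a tiny Java fragment (my own, deliberately broken) that triggers the two most common categories; the messages in the comments are javac's usual wording:

    public class CommonErrors {
        public static void main(String[] args) {
            int total = 0              // javac: "';' expected"        (missing semicolon)
            total = totl + 5;          // javac: "cannot find symbol"  (unknown variable 'totl')
            System.out.println(total);
        }
    }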
The results of this study almost replicate Jadud's study (2005).
Macro Behavior Views (quantitative evidence of how programming behavior relates to success)
- Can we track the success of the student based upon when they started on the assignment? (start date vs. success/grade)
- Incremental work vs. success/grade - if students work on the project incrementally, then they will get a better grade.
- Amount of time vs. grade - students who spend more time on the assignment do much better; but some students who do poorly also commit a lot of time to the assignment.
- Can we measure the effort? Event density vs. grade - there are students who are not succeeding but are really trying: A-average and F-average students have similar event density, and B-average students have higher event density than all others. (A rough sketch of one way such a density might be computed follows this list.)
- What makes a student succeed?
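My notes don't record how "event density" is defined; one plausible reading is logged events per hour of active work, roughly as in the sketch below. The method name, the one-hour break cutoff and the sample data are all my assumptions, not ClockIt's actual schema:

    import java.util.List;

    public class EventDensity {
        // Rough events-per-active-hour measure over sorted event timestamps (milliseconds);
        // gaps longer than an hour are treated as breaks rather than work time.
        static double eventsPerActiveHour(List<Long> timestamps) {
            long activeMillis = 0;
            for (int i = 1; i < timestamps.size(); i++) {
                long gap = timestamps.get(i) - timestamps.get(i - 1);
                if (gap <= 60L * 60 * 1000) {
                    activeMillis += gap;
                }
            }
            double hours = Math.max(activeMillis / 3_600_000.0, 1.0 / 60);  // floor of one minute
            return timestamps.size() / hours;
        }

        public static void main(String[] args) {
            // Hypothetical student: 5 events over roughly half an hour of work.
            List<Long> ts = List.of(0L, 5L * 60_000, 12L * 60_000, 20L * 60_000, 30L * 60_000);
            System.out.printf("%.1f events per active hour%n", eventsPerActiveHour(ts));
        }
    }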
They are looking for more people to use the data loggers ....