Tuesday, April 26, 2011

Livescribe Pen


I was excited to play with a Livescribe pen and finally had the time last weekend.  Basically, you write with the Livescribe pen on special paper, and it records audio and links it to what you have written.  So you can go back in the notebook and tap on what you wrote to hear what was being said at that time.  The recorded "sessions" can be uploaded to your computer and then shared online in the Livescribe community as "pencasts."

Unfortunately, I'm a bit disappointed!  I've had trouble getting it to communicate with my computer.  There is no way to prompt it to transfer sessions - it's supposed to do that automatically when the USB dock is plugged in.  It's also supposed to open Livescribe Desktop automatically when I plug in the pen, but it doesn't.  I will keep working on that.  I have been meaning to contact other users on campus to find out if they've had this problem.  So I'm not certain it's a real problem at this point - it may be me!

Some faculty are using pencasts as a teaching tool.  I agree that the Livescribe pen is well suited for things that are difficult to type, and it's comfortable to write with a pen on paper.  For instance, I made a horribly drawn pencast explaining the landscaping in my yard.  If I could draw well, this would be the easiest way to explain that to someone else.

From what I understand, the initial purpose of the Livescribe pen was to capture meeting or lecture notes, and the pencasts reflect this use: they show the whole notebook page right away, with everything you have written on it in grey.  As the recording plays and reaches the point where you drew or wrote something, that part becomes darker.  So by the end, everything on the page is darker.

Being able to look ahead is nice if you are already familiar with the material and are using it as a review.  As a learner, I disliked it: I found myself looking ahead while trying to pay attention to what was going on at the moment, and it was a bit much.  I'd link my reaction to cognitive load theory, which suggests that hearing text read aloud while seeing the same text overloads a learner and actually inhibits learning.  This is because we can scan with our eyes faster than we can read out loud, so we have a tendency to scan ahead of what is being read if we can.  I think looking ahead at the pencast while trying to listen to what is going on at the moment works the same way.

Here's a quick revelation I had (bear with me, it takes a few sentences to get there).  I admit that I frequently tend to only half listen to most things around me.  I am not a verbal person.  So if I really want to make myself pay attention to something on television or online, I will turn the captions on if they are available.  When I first learned about cognitive load theory, I thought my seemingly successful use of captions contradicted it.  Then I realized that captions are presented in small chunks, so you really can't get too far ahead of what is going on, as long as they are timely.  So I think that with captions, hearing and seeing can reinforce each other because of the timing.  The problem lies in situations where you can read, say, a whole page of text while also hearing it read out loud.  In that situation, a choice should be made between one or the other.  I have experienced this when my mother-in-law (bless her heart) tries to read something out loud that we are both looking at in print.  Her extremely slow reading makes it harder for me to read and understand it myself.

Another disadvantage I see with the Livescribe pen so far is that it exports in a proprietary format (.pencast), so viewers need to install Livescribe Desktop to view the pencasts unless they are uploaded to the Livescribe site.  Although the Livescribe Desktop program is free, it is an additional step, and each user gets only a limited amount of space on the Livescribe site.  The pencasts do not seem to be editable or (gasp!) captionable for students with disabilities.  I will continue to play with them, but I don't see a way to even upload them to YouTube for the automatic captioning program, unless I took a screencast of the pencast...yuck.

Well, those are just my initial impressions of the Livescribe pen.  I'm not writing it off (pun intended), but I had hoped for more.  I've already moved on to the next technology: I'm really excited to get a Wacom Bamboo tablet and see how well it works in combination with Camtasia to make some amazing...Bamboo casts?  Wacom casts?  Tablet casts?  I think I just invented a new hybrid technology.  Tomorrow I am presenting these ideas to some faculty, and I'm excited to find out what they think.

Friday, April 1, 2011

Yaptime

There are so many ways to collaborate and connect online.  Today I learned about Yaptime, a private social network.  It's a great replacement for Ning, which is no longer free.  I can see student groups using it to collaborate on projects.  You can "yap" (aka post messages), upload photos and files, administer surveys, and create a group calendar.