Feb 26, 2009

Digital instruments - more accurate?

Putting the physics/chemistry teacher hat on again. Today, my SL chemistry class tried to determine a value for the specific heat capacity of water using Vernier probes, graphing software and (hopefully) their brains. It was the first time I had taken the new GO sensors out of the box, so while doing my prep I happened to have two sensors connected at the same time. I took a picture of what the software told me.


So what we have are two probes, in the same room, reporting two temperatures that are (by my standards) not even close to one another. I showed this to the students, whose dumbfounded reaction implied I was performing some kind of science voodoo magic.

Quite often, when the 19th century skill of thermometer reading is required in a lab procedure, students will identify the thermometer as a weakness and suggest that using a digital thermometer would be more accurate. They are wrong on two counts, as I tried to explain today.

First, the students are confusing the meanings of accuracy and precision. I will concede that these digital probes may in fact be more precise (one needs to be careful here). Second, as the above result indicates, these probes are hardly accurate (or at least they can't both be). My question was "which reading is more accurate?" Of course, you can't tell (and in fact, a conventional alcohol thermometer read about 22 degrees - adding more fuel to the fire).
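To make the distinction concrete, here is a small sketch (in Python, with hypothetical repeated readings rather than the actual probe data from the photo) showing how an instrument can be very precise yet inaccurate. The "true" temperature is taken as 22 degrees, roughly what the alcohol thermometer suggested:

```python
# Accuracy vs precision with made-up probe readings (illustrative only).
from statistics import mean, stdev

TRUE_TEMP = 22.0  # deg C, assumed reference value

probe_a = [24.1, 24.2, 24.1, 24.2, 24.1]   # hypothetical repeated readings
probe_b = [21.3, 22.6, 21.9, 22.4, 21.7]   # hypothetical repeated readings

for name, readings in [("Probe A", probe_a), ("Probe B", probe_b)]:
    offset = mean(readings) - TRUE_TEMP    # systematic error -> accuracy
    spread = stdev(readings)               # repeatability -> precision
    print(f"{name}: mean offset from true value = {offset:+.2f} deg C, "
          f"spread (std dev) = {spread:.2f} deg C")

# Probe A turns out to be the more precise instrument but the less accurate one.
```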

The point here is, students trust technology more than they should. They believe a digital probe is more accurate than a conventional instrument in most cases - a dogma evidenced by the massive confusion over why two digital instruments might disagree ("one must be broken," said one student).

We need to be careful about what we are really teaching them.

Feb 25, 2009

Digital Footprint revisited

Wikipedia has an entry for digital footprint (and even subcategorizes it into "active" and "passive"), explaining that the basic idea is that if you are an internet user, you have an online presence. In fact, a 2007 report states that 47% of users have searched for themselves online, and you can even download digital footprint calculator software.



I was wondering if there is any thought out there regarding digital footprints in the environmental carbon footprint context. I have noted (through following a number of edubloggers on my RSS) that like-minded people tend to follow each other and, as a result, tend to do a lot of cutting and pasting of one blog posting into another. This is my first example of what I am calling "blogarithmic" expansion - a theory I am putting forward that eventually megawatts of electricity will be required worldwide to run servers (well, mostly fans cooling servers) to store yottabytes of repetitious information. That will add up in terms of energy costs.

On a related matter, to those of us asking our students to contribute and collaborate in online projects through blogs, wikis, nings, etc.: are we deleting this information when we are finished? Or at least extracting and archiving the relevant info to more energy-friendly storage to free up server space? Hmmm!! (I am not even going to try to debate whether the manufacture and burning of a DVD is more energy efficient than keeping that same amount of information on an HDD.)

Musings on what works and what doesn't

I have been looking around for a physics/chemistry teacher who actually blogs on a regular basis with innovative, educationally sound ideas for what works with regard to technological integration in a senior-level science classroom. I think maybe this is a niche I might have to try to fill (setting myself up here, aren't I?)

A couple of experiences recently (see "why I make kids hand draw graphs") have stimulated some reflection which I will share with you here. First of all, I recently read Marc Prensky's post on barriers to technological adoption and adaptation in classrooms, and his comments on students being digital natives reminded me of Chris Betcher's post about how the assumption that today's student is technologically competent is a myth. In a sense I agree with both of them. The "digital native" concept is very real - kids today are embedded in a world much different from the one I grew up in - but as Chris points out, that does not mean we should assume they are competent. My argument about students' own conceptions of "competency" was presented in the post mentioned previously. Sure, kids are comfortable with computers, but that does not mean they can use them effectively - and it is our job as educators to integrate basic skills (like how to draw graphs properly using technology) into our classrooms.
A case in point with examples taken from my last batch of lab reports:

My point is that any kid today can turn on a computer, load up a capitalist application like MS Word and type in a few sentences (two-fingered, because we don't teach keyboarding anymore), but I consider that unacceptable when simple features like "insert symbol" and "subscript/superscript" elude students. This level of technological adoption is what Prensky calls doing "old things in old ways", the second lowest level of the progression towards "Edutopia". I argue that if the training wheels are still needed, let's send that message loud and clear and not overlook it in our quest to do "new things in new ways".

My second experience happened quite by accident (as most of the best teachable moments do) and came about when I was discussing a recent assignment with a Higher Level physics student who graduates in June. The assignment was to use a spreadsheet (MS Excel) to model what happens to an alpha particle when it is accelerated through a potential difference. The kids were given no parameters, just an assignment to produce a number of graphs. Most students found it tough: the equations are difficult and, added to that, the numbers are extremely difficult to represent graphically. They did, however, enjoy the challenge and gave feedback that although it was hard, they really understood the concept.
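For readers curious about the physics, here is a minimal sketch of the kind of model the assignment asks for - my own illustration in Python rather than the students' spreadsheets, and the voltage range is an illustrative assumption. It compares the classical and relativistic speed of an alpha particle accelerated through a potential difference V:

```python
import math

E_CHARGE = 1.602176634e-19      # elementary charge (C)
Q_ALPHA = 2 * E_CHARGE          # alpha particle charge (C)
M_ALPHA = 6.6446573357e-27      # alpha particle rest mass (kg)
C = 2.99792458e8                # speed of light (m/s)

def classical_speed(voltage):
    """Non-relativistic result: qV = (1/2) m v^2."""
    return math.sqrt(2 * Q_ALPHA * voltage / M_ALPHA)

def relativistic_speed(voltage):
    """Relativistic result: qV = (gamma - 1) m c^2."""
    gamma = 1 + Q_ALPHA * voltage / (M_ALPHA * C**2)
    return C * math.sqrt(1 - 1 / gamma**2)

# Tabulate both models over a (hypothetical) range of accelerating voltages,
# from 1 MV up to 100 GV; the classical model happily exceeds c.
for exponent in range(6, 12):
    v = 10.0 ** exponent
    print(f"V = {v:.0e} V  classical: {classical_speed(v)/C:6.3f} c  "
          f"relativistic: {relativistic_speed(v)/C:6.3f} c")
```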

The work the kids produced is exceptional. For the most part (pat myself on the back here), these students are very competent with the software and produce highly professional reports. I have spent lots of time helping them, and they in turn collaborate extensively - not necessarily on how to do the assignment, but on how to get the software to do what they want. How did this happen?

This is the third of these types of assignments they have done. The first was very simple, and the second a more complicated problem on energy in an oscillating spring system. There was no plan for the sequence to even be a sequence or a progression - just me giving them suitable tasks at appropriate times.

There is nothing fancy about this, no cool hook, no internet gadget - just kids using a very relevant piece of (dare I say it) 20th century software in a meaningful manner, which also created a valuable educational experience to supplement a relativity lecture. Why then do I feel like I am not "keeping up with the Joneses" of edtech?



Adopting and Adapting

I recently read Marc Prensky's article "Adopt and Adapt: Shaping Tech for the Classroom" and thought a great deal about the context and my ongoing reflective theme: "why am I, a progressive, tech-savvy educator, both enthusiastic about and resistant to many of these ideas?"

I don't disagree with Prensky's message. I do not suffer from what I like to call "pedagogical inertia". I do feel uncomfortable, though, but I cannot put a definitive finger on why.

Prensky wonders how close we will be to "Edutopia" by 2100. This sets off certain cynical alarms because, as admirable as that pursuit is, it is highly unlikely that we will ever get there. My 15-year career as an educator has seen many phase shifts - cooperative learning, PBL, UbD, etc., to name a few. I find it highly unlikely that, given a perfect one-to-one world with tech-competent educators delivering digitally integrated and enhanced curriculum, the coming of the educational Messiah will be celebrated and no one will ever think again about how to make it better. This smells of Laplace's (foolish) boast - that by knowing the initial conditions (the position and velocity of every particle in the universe) he could, in principle, predict the future with absolute certainty - an obligatory physics/math connection that basically proclaimed the demise of physics research.

Marc is right on, though, with his four-step progression for the typical adoption of technology in schools (or, for that matter, industry and society in general):
  1. Dabbling.
  2. Doing old things in old ways.
  3. Doing old things in new ways.
  4. Doing new things in new ways.
I like to think that most educators are well beyond "dabbling" - at least they are in the environment I work in (which, too, might affect my perspective - top-tier international school vs. public education in Fort McNowhere, USA). So, moving on to "doing old things in old ways", I don't disagree with Prensky's idea that "writing, creating, submitting, and sharing work digitally on the computer via email or instant messaging (is) in the category of doing old things (communicating and exchanging) in old ways (passing stuff around)." I do, however, strongly agree with Chris Betcher's "myth of the digital native" and don't necessarily believe our students are as technologically competent as we think they are (see my next post), and therefore a little more time to smell the roses in this area is necessary - and from the perspective of a 41-year-old digital immigrant, absolutely necessary.

Although idealists like Prensky are the necessary force that drives necessary change, the "if we don't do this yesterday, our students are screwed" approach is not educationally sound in the eyes of the 50-year-old teacher who is the very focus of the "no teacher left behind" educational technology program explosion; a more pragmatic approach is needed. Faced with a barrage of rapidly changing tech options in education (blogs - wikis - nings), often presented with little explanation at near light speed by techies spouting gobbledygook, it is no wonder that "digital immigrant" teachers (even tech-competent ones like myself) are resistant to change.

I see it and feel it. If you want teachers on board (because, as Prensky states, digital immigrants present a huge social barrier to technological adoption), put away your seven different gadgets (cool, sure, but I am not really impressed), stop speaking in tongues, and create plans in schools that shift mainstream thinking gradually. Graphically, I see it like this:
This is the impression mainstream educators get of what people like Prensky expect of them.
This is my proposal for how those expectations should be perceived - a perception that I think will really help dismantle the barriers people like Prensky talk about.


What I like best about this article is Prensky's observation that not being one-to-one is the biggest obstacle to success. Put me on the line with the sign "we need more computers NOW". I firmly believe that tech adoption will be a jerky, inconsistent process until this happens. As a school director, I would insist on directing funding here first. Maybe we won't be doing new things in new ways initially, and light-year leaps in test scores will not occur, but trying any kind of technological adoption without one-to-one is like teaching kids to write with shared pencils. The laptop (notebook) is the binder and pencil case of the present. No more excuses please, Mr. Purse Strings - just do it!

My Project Sketch

The final project assignment for the course I am taking has the following goal statement:

Goal: Develop an authentic and engaging project for your students which meets both your curricular standards and at least one of the NETS standards.

In high school science, we do not do many "projects" but rather focus on experimentation and inquiry. Since much of this process develops valuable hands-on laboratory skills in grades 9 and 10, I am thinking of focusing on senior classes, where certain aspects of the curriculum do not easily lend themselves to experimentation in a traditional sense. I see this as an excellent opportunity to incorporate some aspects of technology.



The more I think about it, the more I feel that there are a number of opportunities within the IB Physics Relativity unit. Although I question how "authentic" these opportunities are in the true science-research sense (there is very little authentic science done in schools, depending on how you define authentic), there is an opportunity to create authentic project tasks that, although not typical of relativistic physics research, can certainly provide a valuable experience.

At present, my ideas include various models for relativistic vs non-relativistic physics with regard to mass, length and time variations in objects moving at near light speed. To keep things authentic, and also to tie in with syllabus objectives, the two best ideas would be to look at particle accelerators (what happens to small charged particles, like alpha particles, in an electric field) or to recreate/simulate the muon experiments that provide convincing evidence for time dilation (seen as a lengthening of the muon half-life) and hence special relativity.
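As a rough indication of what the muon idea could look like once modelled, here is a minimal sketch - my own illustration in Python, not a prescribed student solution; the creation altitude and muon speed are illustrative assumptions, while the half-life is the accepted muon value:

```python
import math

C = 2.99792458e8          # speed of light (m/s)
HALF_LIFE = 1.56e-6       # muon half-life in its own rest frame (s)
ALTITUDE = 10_000.0       # assumed creation altitude (m)
SPEED = 0.98 * C          # assumed muon speed

def surviving_fraction(travel_time, half_life):
    """Radioactive-decay law: N/N0 = (1/2)^(t / t_half)."""
    return 0.5 ** (travel_time / half_life)

lab_time = ALTITUDE / SPEED                       # time of flight in the lab frame
gamma = 1 / math.sqrt(1 - (SPEED / C) ** 2)       # Lorentz factor

classical = surviving_fraction(lab_time, HALF_LIFE)            # no time dilation
relativistic = surviving_fraction(lab_time / gamma, HALF_LIFE) # proper time elapsed

print(f"time of flight: {lab_time*1e6:.1f} microseconds, gamma = {gamma:.2f}")
print(f"surviving fraction, no relativity:   {classical:.2e}")
print(f"surviving fraction, with relativity: {relativistic:.2e}")
```

With these (assumed) numbers, relativity predicts a surviving fraction thousands of times larger than the non-relativistic calculation, which is the essence of the classic muon evidence.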

Students could be given a set of parameters (probably quite broad) and be asked to model and present the results that (theoretically) would be observed. With regard to the alpha particles, different groups could look at different aspects of the problem and then collaborate to produce a comprehensive project.

Students would use spreadsheet software for the modelling exercise and present results in a PowerPoint or other media format to the rest of the class. Each group would contribute results to an overall project composite - probably on the class wiki. Discussion, reflection, feedback and editing could then occur to streamline the results into a concise format suitable for examination review.

NETS - Standards for Students Addressed
1. Creativity and Innovation (parts c. and d.)
2. Communication and Collaboration (part a.)
3. Research and Information Fluency (part d.)
4. Critical Thinking, Problem Solving and Decision Making (part b.)

Feb 23, 2009

Why I make kids draw graphs by hand

I had a really interesting experience in my Chemistry I class the other day. We are studying the properties of matter, and as part of physical properties we look at changes of state and graph temperature vs. time data in an attempt to determine the melting point of a pure substance (stearic acid - comment if you want more info).

My enthusiastic grade 9 budding Einsteins ask, "Can we use computers to draw the graph?" I hesitate for a second, think with my 21st century hat on, then respond, "I would prefer you to draw it by hand following the instructions you have been given, but if you wish to use MS Excel or equivalent, that is OK." Stage set.

Jump to graphs being returned to students. Some dismay is evident, and I find it very interesting that students, in general, are under the impression that computer-generated graphs are somehow worth more marks than hand-drawn graphs. As I pointed out errors in presentation and format to one student, his response said it all: "But I used the computer!" was his plea, as if that somehow made an incorrect graph worth more marks.

What were the errors? Here is a partial list of things I saw (a sketch of a corrected plot follows the list):
  • line graph chosen (should be x-y scatter plot)
  • inappropriate scale (detail of melting point not visible because software autoscales from zero)
  • HUGE points (points are just that: data points should be tiny and surrounded by what we call in the business "point protectors")
  • inappropriate trend line (computers are notorious for this)
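To show what I am after, here is a minimal sketch (using Python/matplotlib and hypothetical warming-curve data, not a student's actual results) of a plot that avoids those errors: an x-y scatter with small markers and a scale chosen to show the melting plateau rather than autoscaling from zero:

```python
import matplotlib.pyplot as plt

# Hypothetical warming-curve data for stearic acid (time in s, temperature in deg C);
# the plateau near 69-70 deg C is where melting occurs.
time_s = [0, 30, 60, 90, 120, 150, 180, 210, 240, 270, 300]
temp_c = [55.0, 58.5, 62.0, 65.5, 68.0, 69.3, 69.5, 69.6, 70.2, 72.5, 75.0]

fig, ax = plt.subplots()
ax.scatter(time_s, temp_c, s=12, color="black")   # small points, no connecting line
ax.set_xlabel("Time (s)")
ax.set_ylabel("Temperature (\u00b0C)")
ax.set_title("Warming curve of stearic acid")
ax.set_ylim(50, 80)                               # scale chosen to show the plateau
plt.show()
```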
I reflected on this student's comment and started to wonder where this attitude is coming from. Why do my students equate "computer" with "better"? (They do the same with digital instruments vs. analog and in almost all cases are entirely incorrect.) Encouragement to use technology is happening in classrooms globally, but I wonder whether "any use" is better than "less but correct use".

There are also two learning issues here. The first is the documented course objective of how to plot data in a correct manner that effectively communicates the data. This is what I am trying to teach, and this is why said student was not as successful as he had hoped in terms of how his work was assessed.

The second learning issue is who is teaching these kids, where, and when (in the scope of, say, their science curriculum progression) - not only the point and principles of graphing, but how to use Excel (or other software) correctly. I emphasize the option of using other software because graphing software varies in its appearance but not so much in its function. If the students understand the purpose of a function (in terms of creating the graph - for example, scaling an axis), then the brand of software does not matter. Somewhere (maybe my classroom), this skill is not being formally taught. My issue here goes beyond blogging and wikis to the real skill, which is graphing (or writing reflectively, or whatever else), and how we as educators are integrating technology inconsistently (sometimes incorrectly, sometimes ineffectually) and, to the detriment of students, creating the impression that digital media somehow improve the quality of their work (presentation and looks excepted).

Comments on Connectivism

George Siemens' paper "Connectivism: A Learning Theory for the Digital Age" took me quite some time (and more than one read) to digest. I wrote down a number of ideas over a number of days and am just finally getting around to publishing my reflections.
I am not sure how I feel about learning theory in general. As I read this paper, the irony in the quote from Gonzalez (2004) was not lost on me:
“One of the most persuasive factors is the shrinking half-life of knowledge. The “half-life of knowledge” is the time span from when knowledge is gained to when it becomes obsolete. Half of what is known today was not known 10 years ago. The amount of knowledge in the world has doubled in the past 10 years and is doubling every 18 months according to the American Society of Training and Documentation (ASTD). To combat the shrinking half-life of knowledge, organizations have been forced to develop new methods of deploying instruction.”

I say irony because "connectivism" (which I guess is new enough not to pass the MS Word spell checker) is a learning theory and, like other knowledge, has a half-life (according to Gonzalez, as quoted by Siemens), so I do (with a wry grin on my face) ask myself whether this paper (also dated 2004) is already obsolete.

That said, I found much of this paper relevant and thought provoking. Siemens quotes Karen Stephenson's question, "How can we continue to stay current in a rapidly evolving information ecology?" This is a great question and one that I (feeling somewhat ignorant and overwhelmed) have asked myself quietly a number of times in the last few weeks. I feel pressure to be at least in the front half of the pack with regard to shifts in educational approaches that reflect applications of technology and information literacy, but when I access the huge amount of information out there, I wonder how anyone keeps current and still teaches full time. The whole process seems to require proficiency in "bandwagon" jumping, in the sense that the application du jour might be obsolete or replaced in less than a single year of teaching. I have other (and, I feel, more important) things to focus on: like how am I going to get my head around general relativity and present it to my students in ways they will understand?

"Sure", some would say, "use the internet, set up a blog or a wiki, do a podcast" - "sure", I say, as well as cover the syllabus objectives completely, ensuring students understand, in less than the allotted time, because the kids are writing a standard exam in May.


I don't really need another learning theory. What I need are time-saving resources and ideas, appropriately leveled for my IB students, that scaffold other classroom experiences in developing relevant skills that will help them (the students) see success and secure employment in the future (yikes - see last paragraph). Oh, and don't forget, they still need to learn about relativity (or maybe they don't??).
My connection here is that I don't have time for idealism (many full-time classroom teachers don't).

I agree with the basic points of the article (to some extent - see the paragraph below), and I am trying to move with the times, but just when I feel I am current, something else crops up that (I am not trying to be funny here) makes me feel behind, ignorant, and ineffective as an educator. Maybe I take it too personally.


Final point - the cynic strikes! The principles of connectivism have the subtle smell of the business model of education; in other words, working towards what a 21st century CEO wants in a graduate (see Alfie Kohn's excellent take on this). I cringe at this thought, wondering idealistically what happened to the love of learning for the sake of, well, just learning.

Feb 14, 2009

And in related news

My top bloggies these last few days
Reading these posts makes me thankful for the fact that working in an international school for the most part eliminates government meddling in education policy. International schools can be insular, though, and would do well to collaborate to establish a common blueprint for the integration of 21st century skills into curricular change for the future. Many hands make light work, and it seems like there are a number of independent inventors struggling to create a new wheel. I am still looking for a road map to change - steps to developing this blueprint - so that I can more clearly see routes to desired outcomes, rather than feel a slave to all the hype (better get going, teach, or you are gonna fall behind). The ringleaders are all about blogging and sharing and collaborating, but what I see is an exchange of ideas without the background of a structured framework. Where is this framework? Should it exist?

Feb 13, 2009

Year of the atheist!?


Yesterday was my 41st birthday. I have been alive in 5 (soon to be 6) calendar decades and sometimes feel like my life is only just starting to gain some momentum. I want this year to be transitional, a change from what was to something new. I guess I am starting to feel (both psychologically and physically) a real sense of my own mortality.

I am going to start by declaring February 12th to be the new "New Year's Eve". The Chinese and Muslims (and no doubt others) work on a lunar calendar, where the new year shifts back about 11 days in our Gregorian calendar each year. I don't propose anything so radical: we just entered the Year of the Ox in China (2009 in the normal sense of timekeeping), and my 42nd year is going to be the year of the atheist.

The year of the atheist represents a paradigm shift in my thinking and in my approach to getting what I want out of life. It is the year I explore my potential as a thinker and a doer, and somehow take a measure of my contributions to humanity. The choice of "atheist" makes sense to me for a number of reasons.

I am reading (and rereading) sections of The God Delusion by Richard Dawkins and find this book to be inspirational in terms of my approach to Feb 13th 2009 - Feb 12th 2010. Dawkins is a courageous voice for science educators, free thinkers, and those sickened by living in a world where much evil and hate is brought on by ignorance and blind faith. I looked up "atheist" in the thesaurus on dictionary.com.

Note the reference: Roget's "21st Century" thesaurus. It interests me that "21st Century" seems to imply progress and is being thrown around with reference to "skills" in education circles like it is the Reformation. Synonyms for "atheist" like "heathen", "infidel", "irreligionist" and "pagan" are hardly 21st century but rather reflect a more "inquisition"-like attitude. Surely we have come further than that in today's society, but the negative stigma associated with the word "atheist" is real, very evident, and, for me, a stimulus for discussion (oh, the controversy).

I have been a closet atheist for some time now. Dawkins has convinced me I no longer need to be. I also think that the "you're an atheist" exclamation by others injects some much-needed controversy into my life. Not that I want to argue the existence of God with anyone (read Dawkins if you want to do this); it is just that the statement symbolizes my desire to cultivate myself as a free thinker.

Other projects for the year of the atheist include my version of the 366-day photo project. Like others (this is not my idea), I am proposing to take and choose one picture for each day of my 42nd year. Picasa and Flickr are being debated right now as my technology medium of choice to keep this record.

I am coining terms that I hope enter mainstream vocabulary in the year of the atheist. "Blogged down" describes how many students feel in our classrooms right now as teachers jump on the 21st century bandwagon. I also hope I am the first person to go on record with the idea of a "digital footprint". More about this later.



Feb 8, 2009

How are my thoughts changing?

I know I need to find more time for this course than I actually have. Reflections take time and often (when under deadlines) are more "knee-jerk" reactions than actual reflections. I can now officially say I have skied in China, and after a weekend of sunshine, near misses and very spicy food, I find myself staring at a large monitor screen wondering where to start with regard to last week's readings.

I have been looking at the revised Bloom's Taxonomy for the 21st century. My perspective as an IB Science teacher questions "creating" being placed above "evaluating", as the latter is the area students struggle with most in laboratory situations, but if I take the perspective that creating implies the production and development of a "novel" idea, then the scheme works well in my mind. The subtext of this comment should read: "just because some element of design/planning is involved does not necessarily mean my students are functioning at the top level of the HOTS". I take this position and apply it to the new "digital verbs" as integrated into, and commented on in, this latest taxonomy revision by Andrew Churches. I struggled with this article but found it relevant to comments I made in my Where, When, How? post just recently. After being recently exposed to what appears to be a jump-on-the-bandwagon, free-for-all integration of anything ICT into our classrooms, I took the point of view of science educator and curriculum developer (what I know) and basically asked whether there is an accepted scope and sequence for the introduction and development of ICT skills in our curriculum. Conveniently, the next article I read (Andrew Churches') gives me a list of 21st century skill action verbs pigeonholed into the revised developmental taxonomy. An excellent start, but in some ways it left a trying-to-ram-a-square-peg-into-a-round-hole taste in my mouth. I am more confused than I was initially, but more determined than ever to find a scheme that makes sense. What I am getting at is basically this:
1. I absolutely believe that educators must change with technology and society in order to understand our students, to best help them learn, to keep them engaged and to prepare them as best we can for the future.
2. I love teaching science through a curriculum that has an established skills progression embedded throughout and one that encourages inquiry.
3. I know ICT tools are critical and must be integrated BUT, although I am enthusiastic about all the different excellent examples of ICT in classrooms I have seen recently, I believe this revolution needs structure, guidelines simplified for educators migrating in, consistency and educationally responsible practices.

The certificate program I am enrolled in is a start.

Feb 5, 2009

Where, when, how?


Yesterday I sat through an all-staff meeting at my school and, like many others, got a glimpse of the future of curriculum planning and development. Our curriculum coordinator has done an admirable job of integrating the current thinking into a plan that includes the hiring of our school's first 21st century literacy coordinator. There were also elements of the 21st century revision of Bloom's taxonomy and a presentation of how our future curriculum mapping system might look.

Now, this is mostly moot for me as I am moving to a new school in August, but I thought about it critically (some might say skeptically) and saw what is going to be a huge project - very current and, I think, difficult to implement. What got me thinking was that in the curriculum mapping/tracking system there was a field to input the relevant 21st century skills that are supposedly being developed/assessed in that particular unit. The mapping function will allow these skills to be tracked from K-12 in all subject areas. Brilliant, right? Hmmm.....

I tied this in with a discussion with a colleague in Manila who thinks kids are suffering from what I am going to call "blogoverload". The requirement to blog now appears in many courses, as teachers jump to integrate the current trends. Related to this is the "bloguniverse" I now live in; as an early citizen, I have noticed that many blog posts are simply links to, and comments about, other people's blog posts. It is a good thing we are in a paperless age, but that doesn't stop me from thinking that terabytes of storage and gigabytes of bandwidth are the minimum requirements to support millions of school children busy "be-clogging" the internet.

I am not trying to be negative, and some of you in the know may point out that I don't know what I am talking about, but my point really is that there is some organization required here. There are definite progressions in terms of general and subject-specific skills that we as educators are all aware of. For example, "inquiry" in a science investigation looks different in grade 6 than in grade 11, and there are a number of developmental steps along the way. I am a firm believer that an effective curriculum in the current sense uses content as a vehicle to negotiate a skills-continuum superhighway. Is this type of guidance needed for all of this 21st century skill integration in our currently developing schools?

In a post entitled Kinder can do powerpoints too!, Silvia Tolisano links to a presentation showing even young students doing interactive presentations. "Great," I say, but as I stick to my guns about my position on "effective learning", I see a bigger picture of a mishmash of 21st century skills being incorporated in an inconsistent manner. Schools: slow down, think. Develop a skill set, categorize and create progressions, incorporate them consistently in course development. I think this is harder than it sounds; the progressions in specific subjects (math, reading) are clear-cut. Not so with ICT, which has the added difficulty of always changing, and doing so rapidly.

Feb 4, 2009

Information overload?!

I am writing this in stages as I am slowly catching up with this course and feeling like I have already collected immense volumes of information about everything "21st century education". I set up my Google Reader, and as I careen at near light speed into the world of the Personal Learning Network, I need Googles to protect my eyes from the insane amount of information shrapnel emerging from my CRT (or should I say LCD).

In some ways, I find I love this thing (my PLN) in that it provides (and has the potential to provide further) all the information I could ever hope to want or need. On the other hand, I stare at the screen the same way I stare at the shelves of books in Barnes & Noble - not really knowing what is good, what isn't, or how to discern, and in a paranoid kind of way, I worry about missing something.

I took a break for a few minutes there to reflect on what I have written and also to browse the feeds in my Google Reader (suggested by the COETAIL facilitators), and decided that I need some kind of filter to sift through all the posts (they can't expect me to read all of them, can they?) - one that only spits out the stuff I want to read (or, I guess, should read with regard to my course commitments). I actually think such a filter exists, and it is called "time and experience" - something that is always at a premium in the world of teaching IB Physics and IB Chemistry.

Is this where I am headed? See overwhelmed (Jeff Utecht)

I know I need to find the time - for browsing, for finding what I like, tracking it, and booting out the rest. With regard to my course, I still don't know what I am looking for, being a mere fledgling in the buzz of 21st century education and still a little hesitant to jump from the nest. I am guessing the pieces will fit together somehow and, with a little patience and some guidance, a tangible picture will form.

What I did find (somewhat randomly) was a great link to a piece of software that is FREE and an alternative to Starry Night for those of us who need to teach astronomy. Check out Stellarium. The most interesting blog post I read dealt with the false perception that our students are technologically competent. Read on here for more information (I especially like the examples, and it made me think about the possibility that some of my expectations are unrealistic).

Feb 2, 2009

Me a skeptic? I hope you have proof!

It was a nice surprise to be informed of the opportunity to participate in the ISB C.O.E.T.A.I.L. program shortly after accepting a teaching position there. "21st century literacy", "technological competency" and other buzzwords have been floating around IS Beijing (the other ISB and my current employer) for a few years now. I remain a self-described "digital immigrant" (but unfortunately one with an inane aversion to blogging) as I am too old to be a "digital native" (the kids we teach), but I am also somewhat of a skeptic - hoping somebody, somewhere, will cut through the fluff and the undeniable sociological relevance and "show me the money" with regard to solid connections between the integration of technology in classrooms and improved learning.

I think I can say this is my principal goal for this certificate program. I am also hoping that I can learn a few nifty things along the way and further develop my knowledge base; and finally, what a great opportunity to network in advance with future co-workers.

So I began my course readings in earnest, open to new ideas but, I must admit, predisposed to others. By all means, I can see the benefits of "engaging", through technology, otherwise "enraged" learners, but there are many pitfalls to negotiate, and I somewhat hesitantly find myself occupying the role of bearer and waver of a big caution flag.

We read in Ito et al. (2008) about “hanging out” being one of three “genres of participation in the new media”, and I cannot help but imagine myself at the age of the students I teach and think about what I was doing. At the time, “teen phones” were becoming quite popular in my home town, in the sense that parents were installing second lines - what we now call “landlines” (is it just me, or does the term often come across with some disdain?) - for their kids to use in order to prevent their “hypersocial” behavior from disrupting the less frequent parental requirement to use a telephone. My point is that teens today are not much different than when I was a kid, but I absolutely concede (in line with the descriptions given in the paper) that HOW today’s teens conduct their (not so different) social relationships has certainly changed.

I internalize this briefly, along with the point Prensky (2005) tries to make, and all I can come up with is that new approaches to teaching that integrate the types of technology kids are using today are necessary in order to keep students motivated to learn in an otherwise disconnected environment (i.e. our schools). “No kidding,” I say to myself as I ponder what the grounds are to contest this obvious argument. I wonder how successful I would be if I still used a slide rule in my physics class in today’s age of graphing calculators (each containing enough computing power for 10 Apollo moon missions) - this reminds me of another ironic observation regarding “engagement” which I hope to remember to address later (add link here).

In my mind, the laptop computer, the internet, etc., of today are the binders, pencil cases and libraries of my day in school. Failure to acknowledge this not only ignores potential learning but is also dramatically unfair to today’s students, who will need a specific skill set (I am not sure if cursive writing is part of that) to function effectively in the world we do/will live in. Their (the students’) technological competency is a necessary component of education in any of today’s schools worth their salt, and how fortunate we don’t have to convince them that “learning to use computers is good for you” (my mother forced me - I so wanted to take wood shop - to learn to type in my first year of high school).
In this I am very specific: first and foremost, I ask, "Am I contributing to developing a useful skill set in my students?" Second, "Is this application actually enhancing their understanding of physics?" And finally, "Is the implementation of the new strategy more efficient?"

I am a skeptic, and I ask these questions because I worry about what I call "educational window-dressing". Today's international schools (and probably others) are in competition for students, and marketing initiatives such as glossy annual reports full of numbers and photos of students using computers prey on the conception that (kids + computers) x school = quality education. Administrators and educators buy into the game as well, but I'm not biting; all other things being equal, I do not accept the impression that the level of IT integration in a classroom is an indicator of good teaching.