Friday, November 14, 2008
Engagement Update!!
This is an update about the type of engagement that Joseph Carrabis waxes tediously semantic about here, and which he claims is much harder to define.
I realized that my first post about engagement was actually very much related to these guys (Joseph Carrabis and Eric T. Peterson) and their "engagement project".
As an interesting aside (or <ASIDE> as Carrabis writes it), one of the posts refers to the fact that there is a patent application for an engagement formula. They mention that "someone working for Google" is the applicant (probably one of the inventors) but the assignee name is Yahoo Inc.
Ahhh...but there are two updates you must look at if engagement (definition 7) is of interest to you.
One of their posts here defines more clearly what the "engagement formula" entails. But EVEN BETTER is this post that shows and tells how to calculate it with Google Analytics' new features!! I've GOT to try that out. I only have one pressing question: WHY oh WHY couldn't I have found that AFTER finals?
And lastly, I'll leave you with this (30-minute) YouTube video where Eric T. Peterson introduces himself for the first 4 minutes "for the few who have not bought my books" :-), and spends the rest of the time discussing how easy web analytics ISN'T, and explaining RAMP (Resources, Analysis, Multivariate testing, Process). He posits that knowing how to use web analytics will determine whether you thrive or dive when web 3.0 hits. In the last few minutes he gives real-life examples of how analytics has made millions. The most interesting example to me was one where analytics helped a company show evidence of "click fraud". And, as they had been paying $12-$15 a click, they were able to recover over 2 million dollars from the search engines.
Wednesday, November 12, 2008
Demystified, but still a little Foggy.
Though this posting's title could be describing me, that wasn't the intent! It's actually the way I feel about this really great post by 'The Future Collective' (whatever that is? Part of the 'Fog'?)
ANYway....I LOVED the post, and it specifically addresses engagement. As alluded to in a previous post, engagement is key in all learning. (DUH!) And 'The Future Collective' does a good job of showing where web analytics fits into the "equation". They/It define/s it simply enough: A+B=C where:
A + B = C -> (they are engaged) + (the site) = (to do what?)
* If you can describe what you want someone to do (”C”) and
* You know what the demonstrations of engagement are for your selected audience (”A”) then
* You can determine what the site (”B”) needs to be in order for “A” to happen such that “C” occurs.
The only FOGGY thing still is...that I DON'T know what the demonstrations of engagement are for my selected audience (learners rather than customers). It's not quite as straightforward as a purchase, or putting something in a cart, or even registering. AND it probably varies quite a bit by person (thus the 'intended audience' qualifier). But my biggest hang-up is that I try to define what demonstrates (learning) engagement in terms of what I know analytics can measure. Right now. Today. I'm pretty sure that progress will only come when we see past that limitation (which for all intents and purposes isn't real anyway!).
Glad I found the site. Will definitely be reading more.
Friday, October 3, 2008
Operationalizing Engagement
Whoa! Don't touch that back button YET!
What? this doesn't 'engage' you?
How about this version and explanation?
Σ(Ci + Di + Ri + Li + Bi + Fi + Ii)
Where
“Visitor Engagement is a function of the number of clicks (Ci), the visit duration (Di), the rate at which the visitor returns to the site over time (Ri), their overall loyalty to the site (Li), their measured awareness of the brand (Bi), their willingness to directly contribute feedback (Fi) and the likelihood that they will engage in specific activities on the site designed to increase awareness and create a lasting impression (Ii).”
"The components of the Visitor Engagement calculation are:
• Click Depth Index: Captures the contribution of page and event views
• Duration Index: Captures the contribution of time spent on site
• Recency Index: Captures the visitor’s “visit velocity”—the rate at which visitors return to the web site over time
• Brand Index: Captures the apparent awareness of the visitor of the brand, site, or product(s)
• Feedback Index: Captures qualitative information including propensity to solicit additional information or supply direct feedback
• Interaction Index: Captures visitor interaction with content or functionality designed to increase level of Attention the visitor is paying to the brand, site, or product(s)
• Loyalty Index: Captures the level of long-term interaction the visitor has with the brand, site, or product(s)"
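Just to make the formula concrete for myself, here's a tiny Python sketch of how such a score might be computed. To be clear: this is my own toy version, NOT Peterson and Carrabis's actual calculation, and I'm assuming each index has already been normalized to a value between 0 and 1:

```python
# Toy visitor-engagement score: the average of the seven component
# indices described in the whitepaper. The index names, normalization,
# and sample visitor data are my own assumptions for illustration.

def engagement_score(indices):
    """Average the seven engagement indices for one visitor."""
    components = ["click_depth", "duration", "recency", "loyalty",
                  "brand", "feedback", "interaction"]
    return sum(indices[c] for c in components) / len(components)

visitor = {
    "click_depth": 0.8,   # viewed many pages and events
    "duration": 0.6,      # moderate time on site
    "recency": 0.9,       # returns often ("visit velocity")
    "loyalty": 0.7,       # long-term interaction
    "brand": 0.2,         # arrived via generic search, not brand terms
    "feedback": 0.0,      # never supplied direct feedback
    "interaction": 0.5,   # some interaction with featured content
}

print(round(engagement_score(visitor), 3))  # 0.529 for this visitor
```

A real implementation would weight the indices differently per site, which is exactly the kind of tuning the whitepaper discusses.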
You can read about that and MUCH more in the whitepaper entitled: Measuring the Immeasurable: Visitor Engagement by Eric T. Peterson and Joseph Carrabis. Engagement in the educational experience is essential! This too is a must read -- and an opportunity to join the conversation about engagement here. And just to whet your appetite, it documents Omniture's response to the subject of measuring engagement (and the authors' response to their response):
"The same guys that want you all to believe web analytics is easy has now declared that “Visitor engagement formulas are largely another fad, just like parachute pants and the Hollywood diet. It’s a measure some consultants and vendors can pitch like snake oil.”Read More......
Omniture’s point that Visitor Engagement is a bad idea because it has subjective components fails to understand the work that folks like Jim Novo, Steve Jackson, Theo Papadakis, Joseph Carrabis and others have done; it makes me wonder if the author bothered to read anyone’s work on the subject."
Architects and Designers meet Analysts for best ROI
The article Information Architecture: Data-Driven Design: Using Web Analytics to Validate Heuristics by Andrea Wiggins, and a cited reference around which the article is written, are must reads. Though it never discusses web analytics and education directly, it explains the need for educational designers and information architects to collaborate with analysts from the beginning of a project, to maximize ROI. It begins with a very insightful quote that applies directly to education:
"...However, web analytics' greatest potential lies in improving the online user experience. When analytics data is shared with the design team, a subtler and more sophisticated user-experience design can emerge."
It discusses how web analytics can help evaluate and quantify the user experience using Robert Rubinoff's user experience audit. His audit has four main components, three of which apply very directly to education (functionality, usability and content) and the other (branding) indirectly. This is the most insightful article (and reference) I have read to date -- even though it does not address educational uses directly -- mainly because it points out the importance of analysts working with designers.
Thursday, October 2, 2008
Adopting Web Analytics in Education: Why so S-L-O-W?
After eliminating the ‘have’ factor, what is left to explore is the ‘can’ factor. It seems only natural that the answer to the ‘if-WAs-can’ question is Yes! In fact, the summarizing statement in an article entitled A Practical Evaluation of Web Analytics states:
“…it is apparent that the vast majority of work in this area focuses, unsurprisingly, on the business domain, in particular e-commerce. However, we would argue that such approaches could equally be applied to cultural and social settings, where understanding user behaviour has less financial impact but is crucial for the continued success of the social context.”
Though the authors did not specifically mention education by name, it too falls into both the cultural and social contexts, wherein understanding user behavior may have less (immediate) financial impact but is crucial to continued (and especially expanding) success in an educational context. The authors do not, however, address the “how-WAs-can” question. There are several other quotes from that article that are worthwhile reading from an educational perspective. However, for me they only highlight another question. This article was written four years ago, in 2004! What progress has been made with web analytics in education since then? Why has the research and early adoption been so slow, if indeed it has moved at all?
Is it funding? Is it because it’s not obvious (from research or actual implementations) that WAs can/will contribute to the “bottom line” of education as it does business? And what IS the “bottom line” of education anyway? Do we agree on this?
Is it accessibility? Is it because there is a much higher demand than supply for ‘experts’ in analytics, and business can make the investment, but education can’t?
Is it technology implementation? Is it because we are only in the first generation of web analytics (the assembly language) that is not yet accessible to educators? Will more adoption come with succeeding generations (authoring languages) of WAs?
Is it political? Have we still not reached consensus about the value and place of web-enabled resources and opportunities in the overall educational picture? Must that battle conclude before WAs in education can move forward? [There is no longer any debate about the impact or necessity of web-enabled business resources and opportunities]
Are educational goals harder to define than business goals? Or are they just harder to define in terms of what is now measured in WAs, instead of what could be measured?
Is it a matter of (excuse the recursion) education? Are there too few educators aware of the concept of WAs in general, much less the possible potential of WAs in education?
Or is it all of these things together, or something else entirely?
And the last and most important question: Which of these roadblocks that slow the adoption of WAs in education can (and will) we help to remove?
Wednesday, September 24, 2008
Wassout There?
Digital rhetoric: Ecologies and economies of digital circulation
by Eyman, Douglas Andrew, Ph.D., Michigan State University, 2007, 235 pages; AAT 3282094. This wasn't really about web analytics per se, but pretty interesting, and probably applies more to the New Media class I'm taking. I think it still might shed some light on how we glean and categorize data/information, ownership, and data 'fingerprints' in digital space.
Utilizing student data within the course management system to determine undergraduate student academic success: An exploratory study
by Campbell, John Patrick, Ph.D., Purdue University, 2007, 219 pages; AAT 3287222. This pretty much covers the aspect that much of what I skimmed on the web was talking about - using analytics through an LMS (Learning Management System). Not too exciting.
A conceptual framework for making knowledge actionable through capital formation
by Baker, Brett Michael, D.Mgt., University of Maryland University College, 2007, 169 pages; AAT 3254328
This is exactly what we've been talking about, but focused on decision making in general rather than education. (Of course, what we're really talking about is making decisions in education anyway.) So it applies generally - but I don't think it discusses web analytics as a tool - hard to say, because there is no search capability - the text is just a picture. But here is a quote from the abstract that made me want to read more -- later.
Key elements of the conceptual framework included data mining, knowledge management, and capital formation processes to facilitate actionable knowledge for decision-making. Knowledge management and data mining developed along independent paths though each has a clear understanding that decision-enabling knowledge is one of the most important assets of any organization.
Then he pretty much defines the need that brought web analytics about:
The rapidly growing volumes of computerized data has keyed the need and development of more automated ways of extracting actionable knowledge
Automatic document-level semantic metadata annotation using folksonomies and domain ontologies
by Al-Khalifa, Hend S., Ph.D., University of Southampton (United Kingdom), 2007; AAT C828878
This uses del.icio.us tags for data mining. More along the lines of my New Media class than web analytics. But interesting.
An architecture for augmenting the SCORM run-time environment with a personalised link service
Studying the implications of hidden learning styles by tracing learners' behaviors in an eLearning system by Sawaan, Sara Yakout Mohamed, M.S., University of Louisville, 2006, 330 pages; AAT 1448633
Though this has a very unusual organization and style, and A LOT of 'learning style' theory to slog through, I think it comes closest to what we're looking for. Though it does not USE web analytics, it tries to gather the same data that an educational analytics package might.
Summary: Web analytics is a NEW emerging field, lots of room for new research and applications in education -- we haven't, as of yet, even scratched the surface.
Whoa! Who Knew? Web Analytics!
I thought I had come up with an original name. Analytics count and track, and kind of 'suck' important data out of the life of a session. Count Trackula seemed like a perfect name. Now maybe not so much.
How did Google Analytics inform me? First I noticed (from the map) that I had a visit today from Italy. Italy? So I looked at the report. It came from a Google search on the words "count trackula"! So guess what I did next? Yep. And there you have it. Now I have two questions: 1) How many of you 18 'absolute unique visitors' knew that already -- but only thought it was amusing? and 2) Do you have any good ideas for a new name -- or should I bother?
Monday, September 22, 2008
Here Today...
Saturday, September 20, 2008
Strange Intersections
I’m taking two sections of IP&T 692R this semester: New Media (taught by David Wiley), and Web Analytics (taught by Clint Rogers). Each requires a blog. I started out with one blog for both classes but it didn’t take me long to realize that it would be less confusing for me, the professors, and my classmates (especially those who are NOT in both of the classes) if I used two separate blogs. Also, I thought it would be kind of useful to have two blogs on which to run the analytics anyway – so I separated them. [The one you are currently reading is for our Web Analytics class (Count Trackula), but you may also have reached this through a link from my New Media blog.]
If you give a mouse a blog…
One day I had the bright idea that as long as I have to be in blog-mode so much anyway, why not keep my scripture, idea, and remember journal in a blog? It would be in one centralized place – I could share it with my family that way, or with others who were close to me. Sounded like a good idea – so I started a new blog (The Record Which I Make). But, just for kicks and learning, I thought I’d make it a WordPress blog (my other two are Blogspot blogs). I did this partially because I was under the (mistaken) impression that WordPress blogs had the capability of pages. [As it turns out, you can’t really have the functionality of pages at all if you host your blog on wordpress.com.] I had envisioned titles for the pages: Scripture Journal, Sudden Strokes (from the J.S. quote related to the Holy Ghost and ‘sudden strokes of ideas’), and Remember, Remember (from Elder Eyring’s Nov ’07 conference address). I began with just a couple of entries.
Fast forward to my web analytics class...
In exploring the Google Analytics that I had enabled for my two 692R blogs, I noticed that you get data back about the search-engine phrases people used to get to your blog. Cahlan’s blog talked a bit about getting a blog to rise to the top of Google’s search results by linking and comments, etc. So I was curious. Could I find either of my 692 blogs with Google’s search engine? How many pages down would they be? I tried words and phrases from my blogs, with no success whatsoever – I even checked 10 or more pages down. Nope. No luck. Then I thought: “try your journal blog”. I didn’t think there was any chance of THAT search returning fruit. In fact, I was actually hoping for no results – after all, it would probably end up being a fairly personal blog in some entries – and I may not WANT someone finding it through a search engine. I knew I could password protect it, or keep it private – but at this point I didn’t think it was very likely anyone would happen across it anyway.
I brought up Google, I tried “A Mouth and Wisdom” (a title of one of the posts) – nope. Nothing. Next I typed “sudden strokes”. ... Enter.............Jackpot! Talk about ‘above the fold’!
I just stared at the results – no way! It was the very top link! What?? Cool! Exciting! Funny? Then I looked at all the other links that Google’s search engine returned. They were almost all about strokes – you know- the brain kind! But there I was at the top. Wow. So people want to find out about strokes, and they get my post about email, IM, Blogs, texts, and tweets. I’ll bet they were highly puzzled! Then I thought: “Should I have some sort of link there for those who find it accidentally? They come looking for a diagnosis or information or…comfort …and they find… my post? What link could I weave into my blog to help them, if by chance they started reading? Isn’t this sort of somehow (albeit a big stretch) like what Elder Ballard had talked about? So…I linked the words ‘premortal life’ in my post to an explanation on the newly redesigned www.mormon.org site.
There’s certainly no huge following of my blog (as evidenced by WordPress’s primitive stats). So far 64 hits – not counting me. And they probably all arrive by accident – what can I offer them? Any suggestions? Or should I just rename my page and tags? And HOW did this come to the top of the search engine? And what am I supposed to learn from this? Or is it just ‘coincidence’ and a simple consequence of not considering that the tag "sudden strokes" could be confused with something totally different?
Saturday, September 13, 2008
More Thoughts about Metrics to Monitor
I’d like to see path metrics. I think that’s already partially available. As an instructional designer I would be very interested in the path that users took through my 'course' including location and duration. It would be cool if you could see it in a visual 'path' – that might make it a little easier to notice emerging patterns. It would be interesting to see this per user, as well as aggregated for all users together (e.g. most used path, etc). Besides reflecting the content that the user found helpful or engaging, path data could also inform the designer a little about the UI. For example, a pattern of going back and forth between two pages, items, links, etc. might provide information to improve the UI – to put those items on the same page, or make them easier to see together in some way.
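To play with the idea, here's a toy Python sketch of what that aggregation might look like. The session data, page names, and the whole "tool" are made up by me for illustration – this isn't something any current analytics package exports this way:

```python
# Toy path analysis: given each session's ordered page visits, find the
# most-used complete path and the most common page-to-page transitions.
# The session data below is invented for illustration.
from collections import Counter

sessions = [
    ["intro", "lesson1", "quiz1"],
    ["intro", "lesson1", "lesson2", "quiz1"],
    ["intro", "lesson1", "quiz1"],
    ["intro", "lesson2", "lesson1", "lesson2"],  # back-and-forth: a UI clue?
]

# Most-used complete path, aggregated over all users
path_counts = Counter(tuple(s) for s in sessions)
print(path_counts.most_common(1))

# Most common single transitions (candidate "emerging patterns")
transitions = Counter(
    (a, b) for s in sessions for a, b in zip(s, s[1:])
)
print(transitions.most_common(3))
```

Even this crude version would surface the lesson2/lesson1 ping-ponging in the last session – exactly the kind of pattern that might suggest putting those two items on the same page.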
I’d like to be able to see more than just location though. I’d also like to be able to tag pieces of the instruction with objective / interaction / strategy or other types of informative tags – and have this information tracked for analysis. This would be especially helpful/interesting to compare with information from assessments.
I can tell that Google Analytics can track clicked-on links (and display percentages). I’d like data on other user interactions too – for example, scrollbars. If I have long pieces of text accessed through a scrollbar – is the scrollbar used? (If not, they didn’t read all of the text.) How is the scrollbar used? Does a user go straight down, up and down, or all the way down – then back up? Do they scroll slowly (looking closely) or quickly (just getting to the end – too quickly for reading)? Similar data could be tracked on different types of UI components (pickers, dropdowns, dialogs, keypresses, etc.) as well as different types of users. In fact, this may be one possible way to categorize different types of users.
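Since (as far as I know) no analytics package reports this out of the box, here's a toy Python sketch of how scroll-event logs might be classified IF we could collect them. The event format (seconds elapsed, percent of page scrolled), the thresholds, and the labels are all my own invention:

```python
# Toy classifier for scroll behavior, assuming we could log scroll
# position over time as (seconds_elapsed, percent_of_page) pairs.
# Thresholds and event data are invented for illustration.

def classify_scroll(events, fast_pct_per_sec=50):
    if not events:
        return "no scrolling (text above the fold only?)"
    # Count upward movements (scrolling back to re-read)
    reversals = sum(
        1 for (t1, p1), (t2, p2) in zip(events, events[1:]) if p2 < p1
    )
    max_depth = max(p for _, p in events)
    duration = events[-1][0] - events[0][0]
    span = max_depth - events[0][1]
    speed = span / duration if duration > 0 else float("inf")
    if max_depth < 90:
        return "did not reach the end of the text"
    if speed > fast_pct_per_sec:
        return "skimmed: reached the end too quickly for reading"
    if reversals > 0:
        return "read with re-reading (scrolled back up)"
    return "read straight through"

# One visitor: steady scroll to the bottom over 40 seconds
log = [(0, 0), (10, 25), (20, 50), (30, 75), (40, 100)]
print(classify_scroll(log))  # read straight through
```

Picking sensible thresholds (how fast is "too fast to read"?) would itself be a research question.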
I’ve tried to think about what clues I get as a teacher from observing a student. Engagement is key. Some analysis of engagement could be covered with the path, duration, and UI component metrics. However – very valuable information is drawn from interpreting body language and facial expressions. Having used Skype in conjunction with a webcam and its software, I realize that there is fairly sophisticated expression tracking already available.
It would be interesting to research and define deltas in facial expressions that, on average, may indicate things such as frustration, boredom, interest, success, etc. Also, it would be great to be able to have time (or other defined triggers) activate a built in web cam – to compare snapshots over time. Granted you may not want to use this extensively – or for every student – but a judicious use would be helpful for a teacher to track an individual, and data from random use might also be interesting/informative. Of course you’d have to slog through all the privacy issues, etc. to collect this type of data.
Lastly, there are metrics that could be made possible through hardware extensions and/or external accessories – something that measured heart rate, perspiration, blood pressure, etc. That’s a bit ‘out there’ as of yet for educational uses, but definitely in the realm of possibility. Not only could this inform the designers, administrators, evaluators, researchers, etc., but best of all it has the potential for self-monitoring/training for the user.
Tuesday, September 9, 2008
Online Learning: Monitoring Metrics of Most Worth
First of all, where do you draw the line on what is/is not a ‘learning environment’ when almost all online environments could be broadly defined as such? Getting past that… I thought this would end up being a simple listing of some generalized data that would be helpful to know – and I could probably make a quick list of those. However, that got me thinking about how to systematically produce that list. The more I thought about that, the more complex things got. This is a rather haphazard enumeration of some of my thoughts:
1) Some of the data to track will be ‘application’* specific, and some will generalize to all applications, others will be specific to types or families of applications. It would be most helpful to start with a (maintained by someone?) list of the latter two, and then just have to think hard about what is specific (if anything) to this app.
2) One aid in helping to analyze or ferret out the data to track might be to do so by design layers. I’ve not yet determined if imposing a ‘layered structure’ clarifies and helps to define the data to track – or if it just serves as a reasonable organization of “bins” into which the interesting data bits can be categorized. For example: right off, I know I want to track where the user ‘came from’, where s/he travelled in ‘my world’, and where s/he went after leaving. So I look at this and say – hmmm, seems to fall into the ‘Control’ layer. On the other hand I might say: What else do I want to track in the ‘Control’ layer? I also know I want to see what the user was “searching for” (if anything) when they found/arrived at my application; this seems to fit in the Message layer. It gets really weird (with a folding over) when I start to consider the Data/Management layer – which is kind of what web analytics is all about anyway, but we would probably want to track info about that as well. Is that meta-meta data?
3) Many of the attributes of tracked data are common, such as: Counts (how many times), Durations (how long), Frequencies (how long between), and Sequences (order). These attributes would apply both internally (within my application) and externally (to/from my application).
4) Many of the metrics dealing with assessment (mastery/proficiency) could be found in SCORM, or other LMS -- but do we really want to take on that 'load'?
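The common attributes in item 3 are simple enough to sketch in code. Here's a toy Python illustration computing all four from a timestamped event log (the events and page names are invented):

```python
# Toy illustration of the four common attributes (Counts, Durations,
# Frequencies, Sequences) computed from a timestamped event log.
# Timestamps are seconds since session start; the data is invented.
from collections import Counter

events = [  # (timestamp, page)
    (0, "intro"), (45, "lesson1"), (200, "quiz1"), (260, "lesson1"),
]

counts = Counter(page for _, page in events)          # how many times
durations = {  # how long on each page (until the next recorded event)
    page: t_next - t
    for (t, page), (t_next, _) in zip(events, events[1:])
}
gaps = [t2 - t1 for (t1, _), (t2, _) in zip(events, events[1:])]  # how long between
sequence = [page for _, page in events]               # order

print(counts["lesson1"], durations["intro"], gaps, sequence)
```

Note that the last event has no duration (nothing follows it) – the familiar "time on last page" problem that real analytics tools also struggle with.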
*By way of clarification, when I use the word “application” it is in a very broad sense and could be referring to a tool, widget, blog, wiki, learning object, an interactive (as if there existed a non-interactive one) database, etc., etc.
What I hope to gain from our web analytics class
Currently, I'm specifically interested in applications for:
1) computer adaptive testing
2) more information about the success (or lack thereof) of a pilot group blog I'm planning to set up in conjunction with personal leadership responsibilities.
Every Command with 'Eggs'actness
I've shared this with several classes I've taught - or attended. I haven't quite figured out what makes it so "funny" yet - but I do know that even though it took place years ago, I continue to find parallels to life and lessons I've learned.
One summer during my college years while working as a waitress I was having a particularly 'bad' day. The dining room was extremely busy, at least one waitress had called in sick, and the cook was even more cantankerous than usual, as were the customers – or so it seemed. I wondered why I seemed to get all of the patrons who wanted special menu items, treatment, or substitutions. As the morning wore on, we quickly ran out of several items, which did not improve the attitude of the cook or the customers. It was in this state that I met Ms. Demanding.
Ms. Demanding ordered a "2 minute" egg, meaning an egg boiled for two minutes. My first reaction was to ask her where she was from. She looked at me a little quizzically, and curtly answered "California". I then tried to explain that a two-minute egg in California was NOT the same as a two-minute egg in Utah. Obviously dismissing my comment as one which made no sense, she emphatically restated: "I want a two-minute egg! NO MORE, NO LESS! Do--you --understand?"
"Yes ma'am!", I replied as I quickly walked away, trying to decide whether to give her exactly what she ordered as a sure way of educating her, or to simply translate the boiling time in consideration of altitude. My inner debate was cut short when the cook told me to put the order up on the wheel, or go away until I was ready to do so. Of course when he saw what was ordered, he swore – at the customer and me. He also started to explain in as demeaning way as possible what a two minute egg would be like. I just said "I know, but she insisted." A sinister little grin crossed his face as he put the egg in the boiling water, and set the timer for exactly two minutes.
A few minutes later when the cook rang the bell to indicate the order was ready, everything looked great. This was a relief as previous versions of pancakes, waffles and hashbrowns had come out of the kitchen that day with a hue ranging somewhere between charcoal and midnight black. The soft boiled egg was in a little bowl with the shell still perfectly intact. I delivered the meal to the table where Ms. Demanding sat, and was met with: "We need more coffee here as soon as possible!".
When I returned to the table with a pot of coffee, she had cracked the egg open and viewed a two-minute egg in Utah – in all its slimy splendor. "Take this back – I can't eat this! I ordered a TWO-MINUTE egg!!" she shrieked.
I replied as civilly as possible: "Ma'am, that IS a two minute egg in Utah. Higher elevations require a longer cooking time."
"Well take it BACK and give me what I WANT!" she shrieked again, her pitch even higher. I wondered as I returned to the kitchen, just what she expected me to do, put it back in the shell, and boil it longer? Just then I noticed.........the microwave!.
[In my defense, and at the risk of dating myself, I need to explain that at this time, due to their cost, microwaves were not yet common household appliances.]
Just then the manager walked by, and I told him of my predicament. He warned me that I needed to watch the egg very carefully, as eggs would explode in the microwave if cooked for too long. So I took extra precaution to set the 'dial' on the very first mark (digital time settings were a future design improvement) and waited. I did this about three times, until the egg looked perfectly soft-boiled. It did not explode. Finally, something had gone right for me that day! I hurried to return the egg to Ms. Demanding. Upon receipt of the egg she informed me that her waffle was now cold, and to take it back and re-heat it – another command which I diligently obeyed.
As I was standing at the microwave waiting for the waffle to reheat, my manager hurried into the kitchen area and told me "You'd better go check on your customer – her egg just exploded on her!"
It seems that at the very moment Ms. Demanding had pierced the yolk with her fork, her egg had stunningly exploded. I was told that it sounded just like a gunshot, and that everyone in the dining room jumped. I tried to compose myself, and to stifle any trace of a smile as I approached Ms. Demanding's table.
The egg was literally (if not figuratively) all over her face, and down her shirt. By the look on her yolk-covered face, I thought I was going to need to treat her for shock, but she quietly excused herself to go to the restroom. As she left, her husband exclaimed: "What the hades do you feed those *%&!@# chickens in Utah anyway, gunpowder?" (edited version)
I spent the rest of the day cleaning egg off of EVERYTHING. It was on the ceiling, it was on the chairs, it was on the carpet, and it was even on the walls clear across the dining room.
What did I learn? Simply this: "Don't nuke the egg!" (even if the client's requirements seem to demand it). Although it took several experiences in software development to connect the day in the dining room to the day in the conference room, it is a strong analogy that I will probably not forget.