Failed Sessions

For exactly two months now I have been working on an issue (re-opened on Oct 7, 2009) where sessions appear to die in Blackboard Vista 8.0.2 hf1.

The first time this came up, Blackboard support wanted us to overhaul the session management. Because the BIG-IP documentation said attempting this new method was a horrible idea, we never got on board. We agreed to conduct dupe.pl tests, which showed there wasn’t a problem with session spray, the problem the solution was designed to resolve. Stonewalled, we closed the ticket when the institution reporting it didn’t have any cases to provide us.

So our client with the issue asked us to resume work on it. The key information they provided me was that their users hit /webct/logonDisplay.dowebct. Since they use Single Sign-On (SSO) from a portal, no users should ever hit this page. While investigating these reports, I found a number of cases of users hitting /webct/displayAssessment.dowebct or /webct/displayAssessmentIntro.dowebct as the guest user.

See, the guest user exists at the domain learning context. Users appear as guest before they log in or after they log out. They should not appear as guest while taking a quiz.
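For the curious, the log trolling amounts to something like the sketch below. The field positions are assumptions based on a combined-style access log, not the exact layout on our nodes, so adjust the indexes before trusting it.

```python
#!/usr/bin/env python
"""Sketch of the log trolling: find guest-user hits on assessment pages.

Assumes a combined-style access log where the authenticated user is the
third whitespace-delimited field and the request path is the seventh;
the real Vista node logs differ, so adjust the indexes accordingly.
"""
import sys

SUSPECT_PATHS = (
    "/webct/displayAssessment.dowebct",
    "/webct/displayAssessmentIntro.dowebct",
)

def guest_assessment_hits(log_path):
    """Yield log lines where the guest user requested an assessment page."""
    with open(log_path) as log:
        for line in log:
            fields = line.split()
            if len(fields) < 7:
                continue  # skip malformed lines
            user = fields[2]   # authenticated user (assumed position)
            path = fields[6]   # URL from the "GET /path HTTP/1.1" request
            if user == "guest" and path.startswith(SUSPECT_PATHS):
                yield line.rstrip()

if __name__ == "__main__":
    for log_file in sys.argv[1:]:
        for hit in guest_assessment_hits(log_file):
            print(hit)
```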

So I provided this information to Blackboard along with the web server logs. They wanted more cases, so I provided more. More clients reported the issue, so I had plenty of sources. Plus, it pointed to this problem affecting at least four, if not all, of our clusters.

Next, our TSM left, so we were provided a new person unfamiliar with us. It took just the first note to make a huge mistake: “Provide us all the logs from all the nodes.” At 5GB of logs times 14 nodes in a cluster, 70GB of information for an event which took up maybe 10KB seems like overkill. So… No. I like to think of myself as proficient at system administration, which means I can gather whatever logs you desire.

Now we come to the second mistake. Please refrain from asking me questions already answered in the ticket. Sure, the ticket has a large amount of information. However, if I can remember what is in the ticket, then so can the people working it.

Unfortunately, I had to answer a question about replicating this by admitting it was based on my log trolling, not actual cases of students complaining. My mistake was not going to the clients to find a description of the problem. Therefore, Blackboard wanted a WebEx so I could explain the same one sentence repetitively. *headdesk* We agreed on me getting a case where a user could explain the problem.

As luck would have it, I got just such a case a few days later. So I captured the web server log information and sent it along with the user’s description. My laziness resulted in me not trimming the log set down to the period of the error. Therefore, this log set showed a user1 login, a user2 login, then a user1 login again. Blackboard responded this might be a case of sporadic shifting users. Hello! I guess these folks are not used to seeing the SSO login, or they would know the session shifted to another user because… it… logged… in?
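Trimming would not have been hard, which makes the laziness less defensible. A filter like the sketch below, assuming common-log-style timestamps in square brackets (the window values are made up), would have done it:

```python
"""Keep only access-log lines inside a given time window.

Assumes common-log-style timestamps like [07/Dec/2009:14:05:32 -0500];
the window values below are made up for illustration.
"""
import sys
from datetime import datetime

LOG_TIME_FORMAT = "%d/%b/%Y:%H:%M:%S"

def in_window(line, start, end):
    """True if the line's timestamp falls between start and end."""
    try:
        stamp = line.split("[", 1)[1].split()[0]  # e.g. 07/Dec/2009:14:05:32
        when = datetime.strptime(stamp, LOG_TIME_FORMAT)
    except (IndexError, ValueError):
        return False  # no parsable timestamp on this line
    return start <= when <= end

if __name__ == "__main__":
    window_start = datetime(2009, 12, 7, 14, 0)   # hypothetical error window
    window_end = datetime(2009, 12, 7, 14, 30)
    for line in sys.stdin:
        if in_window(line, window_start, window_end):
            sys.stdout.write(line)
```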

Even though I pulled the entries from the f5 log showing the client IP address, Blackboard now wants us to implement a configuration change on the f5 to reflect the browser’s IP in our web server logs. Getting such a change isn’t easy for us. Don’t say this is the only way to get client IPs when I… have… sent… you… client IPs. We’ve been at this impasse for 3 weeks. So I get to have another WebEx where I explain the same thing I’ve already written. *headdesk*
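For what it’s worth, handing Blackboard client IPs is essentially a join between the f5 log and the web server log. The sketch below pretends each f5 line carries “client_ip timestamp request_path” and that the web server log is combined format; both layouts are assumptions, but the matching idea is the same one I used.

```python
"""Sketch of matching f5 log entries to web server hits to recover client IPs.

Both formats are assumptions: each f5 line is pretended to carry
"client_ip timestamp request_path", and the web server log is treated as
combined format with the timestamp in field 4 and the path in field 7.
"""
import sys

def load_f5_entries(f5_log_path):
    """Map (timestamp, request_path) -> client IP seen by the f5."""
    mapping = {}
    with open(f5_log_path) as f5_log:
        for line in f5_log:
            fields = line.split()
            if len(fields) >= 3:
                client_ip, stamp, path = fields[0], fields[1], fields[2]
                mapping[(stamp, path)] = client_ip
    return mapping

def annotate(web_log_path, f5_entries):
    """Print each web server hit prefixed by the client IP the f5 recorded."""
    with open(web_log_path) as web_log:
        for line in web_log:
            fields = line.split()
            if len(fields) < 7:
                continue
            stamp = fields[3].lstrip("[")  # e.g. 07/Dec/2009:14:05:32
            path = fields[6]
            client_ip = f5_entries.get((stamp, path), "unknown")
            print("%s %s" % (client_ip, line.rstrip()))

if __name__ == "__main__":
    annotate(sys.argv[2], load_f5_entries(sys.argv[1]))
```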

Maybe it is finally time to ask these people if they are at all familiar with the known issue which sounds exactly like our issue?

VST-3898: When taking an assessment, the session is not kept alive. The student’s session times out, forcing the student to restart the assessment or leaving them unable to complete it.

We plan to implement the upgrade which resolves this issue next week, so I am hoping it does resolve it. I am also tempted to just close this ticket. Should the institutions find they are still having problems in January, once the students have had a few quizzes fail, then maybe I will have forgotten how utterly, completely useless Blackboard has been on this issue.

All I ask is:

  1. Know the information in the ticket so I don’t have to copy and paste from the same ticket.
  2. Don’t ask for all the logs. Tell me what logs you want to view.
  3. Don’t tell me something is the only way when I’ve already shown you another way. I’m not an idiot.
  4. Don’t ask me if the f5 log has the cookie when the entries I’ve already sent you don’t have it.

🙁

State of the LMS

Watched an informative WebEx about The State of the LMS: An Institution Perspective, presented jointly by the Delta Initiative and California State University. A true innovator in this market could become the leader.

Market share numbers annoy me. These are always self-reported numbers from a survey. The sample sizes are almost always not very impressive and, when broken down, don’t really represent the market. DI didn’t post a link to where they got the numbers, just the name of the group. Some digging turned up this Background Information About LMS Deployment from the 2008 Campus Computing Survey. For background information, it is woefully lacking in important details such as sample size, especially the breakdown of the types of institutions in the categories.

The numbers DI quotes from CC are very different from what the Instructional Technology Council reports for the same year: Blackboard market share of 66% (DI/CC) vs 77% (ITC). An 11-point difference is huge when the next largest competitor sits at 10% (DI/CC).

Other critical information is missing: Are these longitudinal numbers, i.e., the same respondents participating every year the survey quotes? Or is there high turnover, meaning an almost completely different set of people answers every year, so the survey completely relies on the randomness of who is willing to respond? If so, the numbers could shift just because people refuse to answer, giving Blackboard reduced market share only because Moodle customers are more willing to respond to questions about it.

Most of the major LMS products on the market started at a university or as part of a consortium involving universities. I knew the background of most of the products in Figure 1. Somehow I never put that together.

Will another university take the lead and, through innovation, cause the next big shakeup? I would have thought the next logical step for the DI presentation to address would be the innovative things universities are doing which could have an impact. Phil described Personal Learning Environments (not named as such) as potentially impacting the LMS market, but he was careful to say PLEs are really an unknown. There were no statements about brand new LMSs recently entering or about to enter the market.

Figure 1: Start year and origin of LMSes. Line thickness indicates market share based on Campus Computing numbers. From the DI WebEx.


When people use my project as an example, it gets my attention. GeorgiaVIEW was described slightly incorrectly on page 26, Trends: Changing definition of “centralization”.

  1. We do not have an instance per institution, which would have a significantly higher licensing cost. We do give each institution their own URL to provide consistency for their users. Changing bookmarks, web pages, portals, etc. everywhere a URL is listed is a nightmare, so we try to minimize the impact when we move them by keeping a single unchanging URL. We have 10 instances for the 31 institutions (plus 8 intercampus programs like Georgia ONmyLINE) we host. Learn 9 will not have the Vista multiple-institution capability, so should we migrate to Learn 9, an instance per institution would have to happen.
  2. We have two primary data centers, not a primary and a backup data center. By having multiple sites, we keep our eggs in multiple baskets.

The primary point about splitting into multiple instances was correct. We performed the two splits because Vista 2 and 3 exhibited performance issues based on both the amount of usage and the amount of data. With ten instances we recently hit 4,500 users active in the past 5 minutes (20,000 total sessions) but should be capable of 50,000 based on the sizing documents. We also crossed 50 million hits and 30 million page views. We also grow by over a terabyte a term now. All these numbers are still accelerating (growing faster every year). I keep hoping to find we have hit a plateau.

Figure 2: LMS consortia around the United States. From the DI WebEx.


All this growth, in my mind, means people in general find us useful. I would expect us to have fewer active users and less data growth should everyone hate us. Of course, the kids on Twitter think GeorgiaVIEW hates them. (Only when you cause a meltdown.)

UPDATE: Corrected the active users number. We have two measures: active and total. 20,000 is the total, i.e., all sessions. 4,500 are active in the past 5 minutes. Thanks to Mark for reading and finding the error!