Nearly two weeks ago, Google+ launched Pages, a version of a person profile for non-people. (Google does know the Supreme Court deemed corporations people too, right? So corporations should have a person profile.)
Companies desiring a social media presence have created a Google+ page in addition to their Facebook pages, Tumblr blogs, and Twitter accounts. Over the past couple of weeks, I have seen a number of posts on Facebook and Twitter alerting me to the new G+ page. They invariably say something like “Make sure to follow <corporate name> on Google+, too.”
I am already following you on one of these sites, which is how I saw the message. Following you on two, three, or more social media sites gets me what, exactly? The same post multiple times. Maybe I notice something important faster, but that might happen with one in two hundred posts. More likely I will shift the important followings to wherever I tend to spend most of my time.
This is the same strategy I use for following friends. At least some of them tend to post different things in different places.
In areas like North America, Europe, Australia, and Asia’s advanced countries, computer cost is no longer an issue. Dell’s cheapest computer costs $379 (with a monitor) and is about 500 times as powerful as the Macintosh Plus I used to write my Ph.D. thesis. While it’s true that a few people can’t even afford $379, in another five years, computers will be one-fourth their current price. Would that all social problems would go away if we simply waited five years.
So $379 / 4 = approximately $95.
At Dell, the company Nielsen picked on, the cheapest I found was a Latitude laptop in the Dell Outlet for $239.
Walmart’s cheapest non-refurbished machine I found was a $212 laptop. (There was a refurbished Pentium 4 desktop for $115, hardware that was old even in 2006, but adding the cheapest $89 monitor brings it to $204, still $109 over the $95 target. You would be better off going to a garage sale, picking up the same computer for $25, and getting a kid in the neighborhood to refurbish it.)
Windows, this means you. Opening a new window steals focus from where I am working and gives it to the new one. Opening a new window I did not explicitly request, while I am typing or navigating to do something critical, infuriates me.
Facebook, this means you too. Adding new comments to the News Feed a tenth of a second before I click on a comment box means I click on the wrong one. It is the kind of thing that will drive people like me to Google+.
My coworkers will thank you too, since I will no longer be discovering creative new obscenities to describe your products.
One of the questions we ask clients initiating an engagement to set up external authentication from our LMS to their server is, “What is the certificate authority for your SSL certificate?” We have been burned by people purchasing certificates from authorities Java does not support. (And Java’s support is indeed limited compared to, say, Mozilla’s.)
We were given the name of an intermediate certificate authority, which set off warning klaxons. Intermediates do not appear in the cacerts file, the keystore of root CAs Java trusts.
So the client set up a test. Failures. The error:
javax.naming.CommunicationException: hostname.domain.tld:port [Root exception is javax.net.ssl.SSLPeerUnverifiedException: peer not authenticated]
From what I was able to find, the error meant the certificate was not understood. Primed to think the intermediate CA was the cause, I started looking at how to make it work. The two potential routes were to have the client add the intermediate CA to their server’s certificate chain, or to test ways to complete the chain by adding the intermediate certificate on my end.
Amy suggested looking at the certificate on the foreign server by connecting with openssl to get a better idea of where the problem was. The command looks like:
openssl s_client -connect hostname:port
The output was pretty clear that it could not understand or trust a self-signed certificate. The “i:” in the certificate chain section of the output is the issuer, and it made clear the certificate was not signed by the intermediate CA we had been told about. It was a self-signed certificate. D’oh!
verify error:num=20:unable to get local issuer certificate
verify error:num=27:certificate not trusted
verify error:num=21:unable to verify the first certificate
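Another quick confirmation that a certificate is self-signed is to compare its subject and issuer, which are identical on a self-signed certificate. A sketch using a throwaway certificate generated on the spot (the CN and file paths here are hypothetical, not the client's):

```shell
# Generate a throwaway self-signed certificate to illustrate
# (CN and file paths are made up for this example).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=hostname.example.com" \
  -keyout /tmp/demo_key.pem -out /tmp/demo_cert.pem 2>/dev/null

# On a self-signed certificate, subject and issuer match exactly.
openssl x509 -in /tmp/demo_cert.pem -noout -subject -issuer
```

The same `openssl x509 -noout -subject -issuer` inspection works on any PEM certificate a client sends you, without connecting to their server at all.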
Clearly I need to make checking the certificate on the foreign host part of our standard practice. I did some spot checking of previous setups that test against LDAP, and every one has a good certificate chain.
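For that standard-practice check, `openssl verify` will flag a certificate that does not chain to a trusted root before any integration work starts. A sketch against the same kind of self-signed certificate (paths and CN are hypothetical):

```shell
# Create a self-signed certificate to check (hypothetical paths/CN).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=ldap.example.com" \
  -keyout /tmp/ss_key.pem -out /tmp/ss_cert.pem 2>/dev/null

# verify exits nonzero and reports the certificate is self-signed,
# because nothing in the trust store issued it.
openssl verify /tmp/ss_cert.pem || true
```

A certificate with a complete, trusted chain prints `OK` instead, which makes this easy to script into a pre-engagement checklist.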
With a black box system, a person working with it sees what goes in and what comes out, but the machine’s decision-making process is obscured. Theories are formed from incomplete evidence about its behavior. Gathering more data points across more situations that confirm the behavior is my way of getting comfortable that a theory is correct. Sometimes we lack the time, the conscientiousness, or even the access to confirm a theory. This leads to magical thinking, like describing the software in human terms: insane, stupid, or out for revenge.
With a white box system, a person working with it can see the logic the machine uses to make decisions. Theories can be built on much more complete evidence, because one can read the code to see what it is intended to do. That evidence is far more direct than anything more testing could provide.
Systems today are so complex they tend to have many parts interacting with each other, and some parts will be black boxes while others are white boxes.
Then there are Application Programming Interfaces (APIs), which expose vendor-supported methods for interacting with a black box by disclosing how they work.
Proprietary systems tend toward a black box model from the perspective of clients. The black box philosophy depends on the experts, the company’s employees, to design the system so it works well and to resolve issues with it, so there is no need for clients to know what it is doing. Where the idea breaks down is that the clients who run the systems need to understand how they work in order to solve problems themselves. Sure, the company helps. However, the client will want enough expertise to manage minor and moderate issues on their own, involving the vendor as little as reasonably possible. Communities arise because peers have already solved one another’s issues, while an answer from the vendor is either formulaic, an inaccurate company line, or otherwise suspect. Peers become the best way to get answers.
Open source systems tend toward a white box model from the perspective of clients. The white box philosophy depends on clients taking the initiative to figure out issues and the solutions that resolve them. Clients become the experts who make the system work well. Where the idea breaks down is that some clients just want something that works and do not want to solve problems themselves. Sure, the open source community helps. Companies have also arisen to take on the role a vendor plays for proprietary systems, giving CIOs “someone to yell at about the product.” It is easier to blame someone else than yourself.
Cases of both the black and the white box will be present in either model. That is actually okay. Anyone can manage both. Really it is about personal preference.
I prefer open source, but only because I love researching how things work, engaging experts, and the hit of dopamine when I get close to solving an issue. My personality is geared toward it, and my career is based on running web services in higher education, so running something is going to be my preference. (Bosses should take note: when I say not to run something, it means the product is so bad I would rather risk becoming obsolete than run it.)
This post came about while discussing how to help our analysts better understand how to work with our systems. It is hard to figure out how to fix something when you cannot look at the problem or the data about the problem, or do anything to fix it. So one thought was to give our analysts more access to test systems so they gain experience solving problems.