A friend posted an article about a political candidate. I wanted to find a related article and add it to that conversation. So, I opened the browser, got pulled into a couple of offline things, found the article, got pulled in a couple of other offline directions, and went back to the post. The app refreshed and jumped to the top of the newsfeed.
Now, I cannot seem to find the original.
Guess I should stick to the browser where everything is in tabs so I don’t lose my place.
Saw a friend ask a question on Facebook about how to set up a store for WordPress. This seemed trivial. Just look on Google for the answer.
But, then I had a thought:
What if my years of working with WordPress yield me better results?
First, I certainly know terminology that would give me better results. I would search for “wordpress ecommerce plugin”. But even if I searched for “best wordpress store”, I bet Google would return better results to me than to a n00b who is just getting started with it.
Perhaps personalization has forced us to come full circle where people more than ever need to ask the question of their friends to get a better answer?
Looks like the storm of visitors to this blog looking for information on that fake video circulating on Facebook is over. Most of the searches were for the hostname of the server, which I happened to mention in the post and which, I guess, put me at the top of the search results.
One individual found me on Facebook and accused me of being the creator of the video because I mentioned it on my blog. Of course, I had her read the blog post for help kicking the hacker’s session out of her account and securing it.
This made me wonder about the possibilities of a better model.
Fifteen years into the Facebook era, it’s well established that people aren’t actually friends with the hundreds or thousands of Facebook friends they may have. They couldn’t be if they tried—research has found that there seems to be a limit to the number of social connections a human brain can manage. Robin Dunbar, an anthropologist at the University of Oxford, is the most famous proponent of this theory, and his estimate of 150—known as “Dunbar’s number”—is often cited as the (approximate) number of casual friends a person can keep track of. There are different Dunbar numbers for different levels of closeness—concentric circles, if you will. The smallest circle, of five friends, consists of someone’s most intimate friendships. One can keep track of 15 close friends, and 50 pretty close friends. Expanding out from the 150 casual friends, this research suggests that the brain can handle 500 acquaintances, and 1,500 is the absolute limit—“the number of faces we can put names to,” Dunbar writes.
I’ve mentally categorized them as:
Must Friends (support clique) : 5 people : a best friend, a member of your inner circle, a person you count on when something big happens in your life
Trust Friends (sympathy group) : 15 people : a friend who shows integrity, someone you feel comfortable with, that you’re always glad to see, but not in your inmost circle; perhaps someone you’d like to be closer to if you had the time or opportunity
Rust Friends (close friends) : 50 people : a person you’ve known for a long, long time; you’re probably not going to get any closer to that person unless something changes, but a part of your life
Just Friends (casual friends) : 150 people : a person you see — at a weekly poker game, at your child’s school — who is enjoyable company, but you have no desire to socialize outside a specific context or to get to know that person better
Acquaintances : 500 people
Facial Recognition : another 780 (bringing total up to 1,500)
The Facebook algorithm already looks at how much we engage with individuals in order to decide which content to show us in the Newsfeed. By deciding which people are important to us, it is, in effect, modeling the Dunbar theory for us. Just in the shadows, without allowing us to veto or decide on it. Well, sort of: we have the options for “Close Friends” and “Acquaintances”, which seem to be taken from Dunbar, albeit at the wrong levels.
It seems plausible that Facebook could formalize the model further by just adding three more levels. They could automatically mark people based on their interpretation of our behavior with the person. And then also allow us to override it by changing the mark. That could help Facebook understand our idealized state of the relationship to better improve the Newsfeed. People leave the service because of frustrations about what they see. For some, that is too much about acquaintances and not enough about close friends. (The algorithms are showing unwanted content based on misunderstanding the individual, who doesn’t understand how to like the correct things to optimize the Newsfeed.)
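The tiering idea above can be sketched in a few lines. This is purely a hypothetical illustration, not anything Facebook actually does: it ranks contacts by an assumed engagement score, slots them into the concentric Dunbar circles using the limits from the list above, and then lets a user override win over the automatic assignment.

```python
# Hypothetical sketch: classify contacts into Dunbar-style tiers from an
# engagement score, with a manual override the user can set.
# Tier names and limits come from the post; the scoring is an assumption.

DUNBAR_TIERS = [
    ("Must Friends", 5),
    ("Trust Friends", 15),
    ("Rust Friends", 50),
    ("Just Friends", 150),
    ("Acquaintances", 500),
    ("Facial Recognition", 1500),
]

def assign_tiers(engagement, overrides=None):
    """Rank contacts by engagement (high to low), slot each into the first
    Dunbar circle whose cumulative limit covers their rank, and apply any
    user overrides last so the idealized relationship wins."""
    overrides = overrides or {}
    ranked = sorted(engagement, key=engagement.get, reverse=True)
    tiers = {}
    for rank, person in enumerate(ranked, start=1):
        for name, limit in DUNBAR_TIERS:
            if rank <= limit:
                tiers[person] = name
                break
        else:
            tiers[person] = None  # beyond the 1,500-face limit
    tiers.update(overrides)
    return tiers

friends = {"alice": 97, "bob": 42, "carol": 3}
print(assign_tiers(friends, overrides={"bob": "Acquaintances"}))
```

The override step is the point: the algorithm’s guess becomes a default the person can correct, instead of a hidden decision.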
Then again, I am probably one of the few Homo Roboticus using social media who would appreciate this. Most people probably would find it overwhelming.
Got a message from a coworker suggesting I was in a video. Naturally, I was supposed to click on it, but it felt wrong. A quick DuckDuckGo search revealed it to be a virus.
If you think a virus was installed on your device, then my advice is to find trusted anti-virus software and scan your computer. There are also anti-malware apps to scan and protect your phone. Some carriers offer them for free.
Some reports suggest if you click on it, then you get a Facebook login page.
Only it is not a real one; it is designed to capture your credentials. That gives another party your credentials so that they can:
send this out as messages to your contacts
capture more information from your account
If you fell for the fake login page, then my advice is to:
Immediately change your password.
Kick off all sessions in the “Security and Login” page. There is a “Log Out Of All Sessions” option.
Also in the security section, set up two-factor authentication.
Turn on getting alerts about unrecognized logins.
Of all the things I can report, I cannot report this?
It seems like Facebook should be able to detect this virus or phishing by now. What I can see of the link goes to a Facebook server: si-chao.cstools.facebook.com. So, at least the link to the virus/phishing is on their servers, where they could check for its presence.
The person who sent it to me says the account was locked out for 24 hours for behaving suspiciously. The act of sending hundreds of messages in a few seconds alerted Facebook to automated behavior. So, these are accounts they could be checking for compromise.
Sociology has a concept of us holding multiple social roles. At home, I am both a husband and a father. With relatives, I am a son, nephew, or cousin. At work, I am a supervisee, mentor, subject matter expert, or organization historian. Things get a bit more undefined out in the wider world, but I hold social roles out there too.
Each of these social roles varies in its expectations of behavior. So, our behavior may vary depending on which role we occupy at a given time. And, even more interesting is when we have to juggle multiple social roles AT THE SAME TIME for the first time. The more experience we attain doing something, the better we get at figuring out the constraints and minefields in a situation.
The human brain devotes a large amount of processing to managing information about the behavior of others to determine trust, and to ensuring our own behaviors are trustworthy. (You’ve read my prior stuff on Dunbar, right? 1, 2)
Perhaps part of the stress inducing nature of social media is the mixing of these social roles? A giant social network like Facebook means having a variety of relatives, coworkers, and friends mixing in the same spaces. People who come from different backgrounds, political viewpoints, education levels, interests, and levels of restraint. Navigating all this probably generates a ton of stress.
If so, then we need more segmentation.
Limit coworkers to more work appropriate social networks like LinkedIn.
Join topic groups and post content related to it there. To talk about politics, join groups that discuss it. (Be careful to avoid echo chamber groups.)
A private place to discuss more openly with friends. Maybe a private twitter account, a private Facebook group, group chat, etc.
A private place to discuss more openly with family.
Dear Facebook, it would be awesome if you would create a spoilers option for posts where the poster could say what it contains.
You would get users feeding you data about their engagement with media, which is useful for advertisers.
Nice people could contain the damage of spoilers.
As it is, I saw several people create a post and put the spoiler in a comment, which Facebook showed me in the preview. So, people get spoiled inadvertently by people not intending to do so. A person trying not to spoil others has to create a post that says the content contains spoilers, create a spoiler-free comment on it, and reply to that comment with the spoilers. That is pretty cumbersome, and other commenters might not get it and accidentally post a spoiler comment by not replying to the spoiler-free one.
Another approach Facebook might take is something similar to Twitter’s “muted keywords.” The person seeking to avoid a topic can enter what they are trying to avoid, and anything containing it disappears. There is a Tumblr XKit browser extension that operates similarly by collapsing the post into a message saying it is hidden because it contains the keyword. The XKit method is nice for TV shows because I do not have to add and remove keywords each week.
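The XKit-style collapse behavior is simple enough to sketch. This is a toy illustration under my own assumptions (posts as plain strings, case-insensitive matching), not how XKit or Twitter actually implement it: instead of deleting a matching post, it swaps in a placeholder naming the muted keyword, so the reader could still choose to expand it.

```python
# Toy sketch of keyword muting: collapse any post containing a muted
# keyword into a short placeholder instead of removing it outright.
# Post format and matching rules are assumptions for illustration.

def collapse_muted(posts, muted_keywords):
    """Return the feed with muted posts replaced by a hidden-post notice."""
    shown = []
    for post in posts:
        text = post.lower()
        # First muted keyword found in the post, if any (case-insensitive).
        hit = next((k for k in muted_keywords if k.lower() in text), None)
        if hit:
            shown.append(f"[hidden: contains muted keyword '{hit}']")
        else:
            shown.append(post)
    return shown

feed = [
    "Big game tonight!",
    "Endgame spoiler: the hero wins.",
]
print(collapse_muted(feed, ["Endgame"]))
```

The collapse-instead-of-delete choice matters: the feed keeps its shape, and nothing is silently lost if a keyword matches too broadly.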
It boggles the mind that it is 2019 and this has not yet been solved by the social media giants, such that we are still relying on 3rd-party products that try to help. There are Facebook versions of XKit that work in desktop browsers, but they are no help inside the Facebook app.