A friend posted an article about a political candidate. I wanted to find a related article and add it to that conversation. So, I opened the browser, got pulled into a couple of offline things, found the article, got pulled in a couple of other offline directions, and went back to the post. The app refreshed and went to the top of the newsfeed.
Now, I cannot seem to find the original.
Guess I should stick to the browser where everything is in tabs so I don’t lose my place.
Looks like the storm of visitors to this blog looking for information on that fake video circulating on Facebook is over. Most of the searches were for the hostname of the server, which I happened to mention in the post and which, I guess, put me at the top of the search results.
One individual found me on Facebook and accused me of being the creator of the video because I mentioned it on my blog. Of course, I had her read the blog post for help with her account: kicking out the hacker’s session and securing it.
This made me wonder about the possibilities of a better model.
Fifteen years into the Facebook era, it’s well established that people aren’t actually friends with the hundreds or thousands of Facebook friends they may have. They couldn’t be if they tried—research has found that there seems to be a limit to the number of social connections a human brain can manage. Robin Dunbar, an anthropologist at the University of Oxford, is the most famous proponent of this theory, and his estimate of 150—known as “Dunbar’s number”—is often cited as the (approximate) number of casual friends a person can keep track of. There are different Dunbar numbers for different levels of closeness—concentric circles, if you will. The smallest circle, of five friends, consists of someone’s most intimate friendships. One can keep track of 15 close friends, and 50 pretty close friends. Expanding out from the 150 casual friends, this research suggests that the brain can handle 500 acquaintances, and 1,500 is the absolute limit—“the number of faces we can put names to,” Dunbar writes.
I’ve mentally categorized them as:
Must Friends (support clique) : 5 people : a best friend, a member of your inner circle, a person you count on when something big happens in your life
Trust Friends (sympathy group) : 15 people : a friend who shows integrity, someone you feel comfortable with, that you’re always glad to see, but not in your inmost circle; perhaps someone you’d like to be closer to if you had the time or opportunity
Rust Friends (close friends) : 50 people : a person you’ve known for a long, long time; you’re probably not going to get any closer to that person unless something changes, but a part of your life
Just Friends (casual friends) : 150 people : a person you see — at a weekly poker game, at your child’s school — who is enjoyable company, but you have no desire to socialize outside a specific context or to get to know that person better
Acquaintances : 500 people
Facial Recognition : another 780 (bringing total up to 1,500)
The Facebook algorithm already looks at how much we engage with individuals in order to decide which content to show us in the Newsfeed. By deciding which people are important to us, it is, in effect, modeling the Dunbar theory for us, just in the shadows, without allowing us to see or veto the result. Well, sort of: we do have the options for “Close Friends” and “Acquaintances,” which seem to be taken from Dunbar, albeit at the wrong levels.
It seems plausible that Facebook could formalize the model by adding three more levels. It could automatically mark people based on its interpretation of our behavior with each person, and then allow us to override it by changing the mark. That would help Facebook understand our idealized version of each relationship and improve the Newsfeed. People leave the service out of frustration with what they see. For some, that is too much about acquaintances and not enough about close friends. (The algorithms show unwanted content because they misunderstand the individual, who in turn does not understand how to like the correct things to optimize the Newsfeed.)
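To make the idea concrete, here is a toy sketch of how engagement counts could be bucketed into the Dunbar-style tiers listed above. The tier names and cutoffs come from this post; the function and data are hypothetical, not anything Facebook actually exposes.

```python
# Toy sketch: auto-assign friends to Dunbar-style tiers from engagement
# counts. Tier limits are cumulative: the top 5 friends are "Must Friends,"
# ranks 6-15 are "Trust Friends," and so on out to 1,500.

DUNBAR_TIERS = [
    ("Must Friends", 5),
    ("Trust Friends", 15),
    ("Rust Friends", 50),
    ("Just Friends", 150),
    ("Acquaintances", 500),
    ("Facial Recognition", 1500),
]

def assign_tiers(engagement):
    """Rank friends by interaction count, then fill each tier in order.

    engagement: dict mapping friend name -> interaction count.
    Returns a dict mapping friend name -> tier name.
    """
    ranked = sorted(engagement, key=engagement.get, reverse=True)
    tiers = {}
    for rank, friend in enumerate(ranked, start=1):
        for name, limit in DUNBAR_TIERS:
            if rank <= limit:
                tiers[friend] = name
                break
    return tiers
```

The override idea from above would just be a second dict of user-chosen tiers consulted before the automatic one.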
Then again, I am probably one of the few Homo Roboticus using social media who would appreciate this. Most people probably would find it overwhelming.
Got a message from a coworker that suggested I was in a video. Naturally, I am supposed to click on it, but it felt wrong. A quick Duck Duck Go search revealed it to be a virus.
If you think a virus was installed on your device, then my advice is to find trusted anti-virus software to scan your computer. There are also anti-malware apps to scan and protect your phone; some carriers offer them for free.
Some reports suggest that if you click on it, you get a Facebook login page.
Only it is not a real one; it is designed to capture your credentials. That gives another party your credentials so that they can:
send this out as messages to your contacts
capture more information from your account
If you fell for the fake login page, then my advice is to:
Immediately change your password.
Kick off all sessions in the “Security and Login” page. There is a “Log Out Of All Sessions” option.
Also in the security section, set up two-factor authentication.
Turn on getting alerts about unrecognized logins.
Of all the things I can report, I cannot report this?
It seems like Facebook should be able to detect this virus or phishing attempt by now. What I can see of the link goes to a Facebook server: si-chao.cstools.facebook.com. So the link to the virus/phishing page is on their servers, which means they could check for its presence.
The person who sent it to me says the account was locked out for 24 hours for behaving suspiciously. The act of sending hundreds of messages in a few seconds alerted Facebook to automated behavior. So these are accounts they could be checking for compromise.
Dear Facebook, it would be awesome if you would create a spoilers option for posts where the poster could say what it contains.
You would get users feeding you data about their engagement with media, which is useful for advertisers.
Nice people could contain the damage of spoilers.
As it is, I saw several people create a post and put the spoiler in a comment, which Facebook showed to me in the preview. So people get spoiled inadvertently by people not intending to do so. A person trying not to spoil others has to create a post that says the content contains spoilers, create a spoiler-free comment on it, and reply to that comment with the spoilers. That is pretty cumbersome, and other commenters might not get it and accidentally put a spoiler comment at the top level by not replying to the spoiler-free one.
Another approach Facebook might take is something similar to Twitter’s “muted keywords.” The person seeking to avoid a topic can enter the terms they are trying to avoid, and anything containing them disappears. There is a Tumblr XKit browser extension that operates similarly by collapsing the post into a message saying it is hidden because it contains the keyword. The XKit method is nice for TV shows because I do not have to add and remove keywords each week.
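The XKit-style behavior described above is simple to sketch: instead of deleting a matching post, collapse it into a notice naming the keyword. This is a minimal illustration in plain Python; it does not touch any real Facebook or Tumblr API, and the function name is my own.

```python
# Minimal sketch of keyword muting in the XKit style: a post containing
# a muted keyword is collapsed into a notice rather than silently removed,
# so the reader knows something was hidden and why.

def mute_posts(posts, muted_keywords):
    """Replace any post containing a muted keyword with a collapse notice.

    posts: list of post strings.
    muted_keywords: iterable of words/phrases to hide (case-insensitive).
    """
    result = []
    for post in posts:
        lowered = post.lower()
        hit = next((kw for kw in muted_keywords if kw.lower() in lowered), None)
        if hit:
            result.append(f'[Post hidden: contains muted keyword "{hit}"]')
        else:
            result.append(post)
    return result
```

A real implementation would run over the feed as it renders and offer an “show anyway” expander, but the core matching is this simple.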
It boggles the mind that it is 2019 and this has not yet been solved by the social media giants, such that we are still relying on third-party products that try to help. There are Facebook versions of XKit, but they work on desktop browsers and are no help inside the Facebook app.
When people post a link, a Facebook bot looks at the page, finds the content of the <title> tag, and creates a summary. My modest proposal is that it also locate the publication datestamp and include it there.
Every Facebook post shows the name of the poster along with when they posted it. It might read “Just now,” then minutes or hours; after a day, the date; and after a year, how many years ago.
If Mark posts an article from two years ago right now, it can appear fresh and new. Facebook also scrubs URLs, so even if the URL indicated the publication date, one must click through to know that the article is old. And we all know that, in general, people re-share things without doing such due diligence. This could be part of why missing-persons posts get shared years after the person was found: people have no idea that the article is 1-10 years old without clicking through.
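The preview bot already parses the page for the <title> tag, so pulling a date alongside it is a small step. Here is a sketch using only Python’s standard-library HTML parser. Many sites expose the date through an Open Graph-style meta tag such as article:published_time; this toy parser looks only for that one pattern, so it is illustrative rather than robust, and the class and function names are my own.

```python
# Sketch of a link-preview extractor that surfaces the publication date
# next to the title, so an old article cannot pose as new in a share.

from html.parser import HTMLParser

class PreviewParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = None
        self.published = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("property") == "article:published_time":
            self.published = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data

def link_preview(html):
    """Return the title and publication date found in an HTML page."""
    parser = PreviewParser()
    parser.feed(html)
    return {"title": parser.title, "published": parser.published}
```

A production version would fall back to other date markers (schema.org metadata, visible bylines) when the meta tag is missing, but even this much would let the preview card say “Published 2017” up front.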
All social networks became popular because of trivialities. “What’s on your mind?” THAT is what we want. Users flocked to them for the gossip, the random, and the meaningless.
Corporations need to monetize somehow, and ads are how social networks try to do so. Facebook showed that targeting ads by collecting numerous attributes about us is the way to make the most money. Tumblr, for example, has completely inane ads that only get clicked by accident, because every couple of posts presented is an ad. Instagram has almost as many ads as Tumblr but the targeting of Facebook.
Tribe, Friendster, and Myspace died because users left. The triviality was lost, so there was no reason to stay. Something I find fascinating is that Facebook has survived several exodus movements. Not enough people left to kill it.
I wonder if Facebook, Instagram, Twitter, Snapchat, etc are capable of dying in the modern era. Will enough people leave to cause an exodus movement?
Yes, Google+ was killed, but it died because it never made it into the user consciousness. I suspect that is because Google tried to make it the cornerstone of their ecosystem. It would be like Microsoft creating a social network around Office. Productivity tools do not a social network make.
Here is the thing: taking away that permission would make Facebook unusable, as no one could see your posts, not even the people you want to see them. If Facebook cannot use them, then it cannot show them to others on your behalf.
I think Facebook should start:
Programmatically detect when a user posts one of these statuses.
Disable access to photos and status updates for any user who has posted one, and do not allow them to make new ones.
Let them see the posts of others who have not posted it.
Highlight to the user that no one can see their stuff due to having that post. Give them the option of deleting the post to restore access.
My guess is if Facebook did this, then these posts would disappear from Facebook pretty quickly.
A legitimate message expressing concern about your impersonation account would:
Ask if you created another account.
Provide the address to the new account so you can go to the profile, click the three dots on the cover photo, select Report, and follow the instructions for impersonation.
Instead, the hot hoax right now says:
Hi….I actually got another friend request from you which I ignored so you may want to check your account. Hold your finger on the message until the forward button appears…then hit forward and all the people you want to forward too….PLEASE DO NOT ACCEPT A NEW friendship FROM ME AT THIS TIME.
Let’s break this down.
First, we have the preying on a fear we all have about our Facebook accounts getting hacked. Worse, this “hacker” is now going after friends.
But, the recommendation makes no sense at all. “Hold your finger on the message until the forward button appears…then hit forward and all the people you want to forward too…”
Forwarding the message to others is how chain letters operate. You are being played by forwarding it. You are spreading fear. You are not helping.