The last few weeks in the UK have seen headline after headline about the grades given to students as they look to go to university or college, or hope to land that dream job.
The headlines were mostly about the algorithm used to apply an additional data point and, in some cases, override teacher-assessed grades.
The results were not only deemed unfair, but also shone yet another light on a government that seems to bumble from one 'reactive' catastrophe to another.
In many ways, social-media platforms are simply giant algorithms.
For those who have been long-term shoppers on 'Amazon', you can't help but notice the algorithm that suggests 'people who bought this also bought that' - it's a numbers game that has been mathematically proven to work.
Back in the day this was called 'propensity modelling', used by mail-order and insurance companies to better understand risk, and to decide what to send out and who to send re-mailings to.
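As a toy sketch (not Amazon's actual system - the baskets and items here are entirely made up), the 'people who bought this also bought that' idea boils down to counting which items appear together in the same order:

```python
from collections import Counter
from itertools import permutations

# Hypothetical purchase histories - illustrative data only
baskets = [
    ["kettle", "toaster"],
    ["kettle", "toaster"],
    ["kettle", "mug"],
]

# Count how often each ordered pair of items shares a basket
co_bought = Counter()
for basket in baskets:
    for a, b in permutations(basket, 2):
        co_bought[(a, b)] += 1

def also_bought(item, top_n=2):
    """Items most often bought alongside `item`, best first."""
    scores = {b: n for (a, b), n in co_bought.items() if a == item}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(also_bought("kettle"))  # -> ['toaster', 'mug']
```

Real systems add a lot on top (normalising for item popularity, time decay, and so on), but the core is exactly this kind of co-occurrence counting.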
Great brands work extremely hard to 'draw' in the people who most 'connect' with the brand's personality - to do this they use hard data and soft data.
In the social space most of the platforms operate a similar algorithm - including Linkydink.
The algo will feed your bias, whatever that might be.
When you start to 'like', share, or comment on something you get fed more of the same.
If you find yourself in a state of heightened anxiety, chances are you are feeding that anxiety by subconsciously telling the algo what you want to see.
The good news is that if you are conscious of this, and open-minded, you can flood your social feed with 'happy news' only.
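That feedback loop can be sketched in a few lines. This is a deliberately crude simulation, not any platform's real ranking system: every engagement bumps a topic's weight, and the feed samples the next post in proportion to those weights.

```python
import random

# Hypothetical topics with equal starting weights - illustrative only
weights = {"happy news": 1.0, "outrage": 1.0, "cat videos": 1.0}

def next_post(rng):
    """Pick the next post's topic, proportional to current weights."""
    topics = list(weights)
    return rng.choices(topics, weights=[weights[t] for t in topics])[0]

def engage(topic, boost=0.5):
    """Liking, sharing or commenting tells the feed: more of this."""
    weights[topic] += boost

rng = random.Random(0)
# Simulate a user who only engages with 'happy news'
for _ in range(200):
    topic = next_post(rng)
    if topic == "happy news":
        engage(topic)

share = weights["happy news"] / sum(weights.values())
print(f"feed share of 'happy news': {share:.0%}")
```

Because only one topic ever gets boosted, its share of the feed grows round after round - the same self-reinforcing dynamic works just as well when the topic being rewarded is outrage.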
But these algorithms can go seriously wrong. They have been proved to push people towards hateful and extremist content. Extreme content simply does better than nuance on social media. And algorithms know that.
Facebook's own civil-rights audit called for the company to do everything in its power to prevent its algorithm from "driving people toward self-reinforcing echo chambers of extremism".
And last month we reported on how algorithms on online retail sites - designed to work out what you want to buy - were pushing racist and hateful products. (link below)
So, here's an experiment that might just cheer you up:
For the next three days (on any social network), try commenting only on content that makes you smile or improves your knowledge of a subject you're interested in - and scroll past anything that makes you annoyed or uncomfortable.
Then see how you feel.
Whether it's house, car, health or any other form of insurance, your insurer has to somehow assess the chances of something actually going wrong.

In many ways, the insurance industry pioneered using data about the past to determine future outcomes - that's the basis of the whole sector, according to Timandra Harkness, author of Big Data: Does Size Matter? Getting a computer to do it was always going to be the logical next step.

"Algorithms can affect your life very much and yet you as an individual don't necessarily get a lot of input," she says.

"We all know if you move to a different postcode, your insurance goes up or down. That's not because of you, it's because other people have been more or less likely to have been victims of crime, or had accidents or whatever."
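The postcode point can be made concrete with a minimal sketch. The postcodes, claim counts and base premium below are invented for illustration, and real pricing models weigh many more factors - but the mechanism is the same: your price tracks your neighbours' claims history, not your own behaviour.

```python
# Hypothetical historical claims per postcode - illustrative numbers only
claims = {
    "AB1": {"policies": 1000, "claims": 30},  # 3% claim rate
    "CD2": {"policies": 1000, "claims": 90},  # 9% claim rate
}

BASE_PREMIUM = 300.0       # assumed average annual premium
AVERAGE_CLAIM_RATE = 0.05  # assumed claim rate across the whole book

def premium(postcode):
    """Scale the base premium by how the postcode's past claim rate
    compares to the average across all policies."""
    d = claims[postcode]
    rate = d["claims"] / d["policies"]
    return round(BASE_PREMIUM * rate / AVERAGE_CLAIM_RATE, 2)

print(premium("AB1"))  # 180.0 - below-average claims, cheaper cover
print(premium("CD2"))  # 540.0 - pricier, through no fault of your own
```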