UKIP support (and Brexit?) is driven by the perception of immigration, not by actual immigration.
What is the best classifier in machine learning? One paper suggested we should all just switch to random forests straight away. But then again, maybe not. So… keep doing whatever you were doing before? In any case, thanks to this I learned about XGBoost.
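For the "just try a random forest first" advice, here is a minimal scikit-learn sketch (my own illustration, not from the paper; the iris dataset and all parameters are arbitrary choices for the example):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# A random forest as an out-of-the-box baseline: no tuning, default-ish settings.
X, y = load_iris(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
acc = clf.score(Xte, yte)  # held-out accuracy, typically very high on iris
```

XGBoost slots into the same fit/score workflow, which is part of why it caught on so quickly.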
When you eat a dried fig, you’re probably chewing fig-wasp mummies, too. Learn everything you can about figs and wasps and fig wasps!
Do animals fight wars, and if so, what was the largest war? This is the story of the Argentine ant and its continent-spanning supercolonies.
The NIH postdoc salaries are out for the year, and you can see the Obama pay raise! …if you are a 0- or 1-year postdoc, that is.
Here is the Standard Model of physics in one equation (click through for an explanation):
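For reference, the compact "coffee-mug" form of the Standard Model Lagrangian (schematically, up to conventions; the linked explanation unpacks each term) is:

```latex
\mathcal{L} = -\tfrac{1}{4} F_{\mu\nu} F^{\mu\nu}
            + i \bar{\psi} \gamma^\mu D_\mu \psi
            + \bar{\psi}_i \, y_{ij} \, \psi_j \, \phi + \mathrm{h.c.}
            + \left| D_\mu \phi \right|^2 - V(\phi)
```

Gauge fields, fermion kinetic terms, Yukawa couplings, and the Higgs sector, in that order.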
Are markets efficient? A good discussion between Fama and Thaler. Unsurprisingly, a lot of it comes down to semantics.
Did the human brain evolve to speak multiple languages? I think the historical evidence is a resounding yes.
So why, indeed, did purely supervised learning with backpropagation not work well in the past? Geoffrey Hinton summarized the findings to date in these four points:
Our labeled datasets were thousands of times too small.
Our computers were millions of times too slow.
We initialized the weights in a stupid way.
We used the wrong type of non-linearity.
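The last two points can be illustrated numerically. Here is a small numpy sketch (my own, not Hinton's; depth, width, and scales are arbitrary choices) showing that in a deep net, sigmoid units with naive small-scale initialization make the gradient at the first layer vanish, while ReLU units with He-style initialization keep it a usable size:

```python
import numpy as np

rng = np.random.default_rng(0)
depth, width = 20, 100  # a deep, narrow MLP for illustration

def grad_norm_first_layer(activation, init_scale):
    """Forward-propagate a random input through `depth` linear layers plus
    the given nonlinearity, then backpropagate a unit gradient and return
    the gradient norm reaching the input layer."""
    x = rng.standard_normal(width)
    Ws = [rng.standard_normal((width, width)) * init_scale for _ in range(depth)]
    pre, h = [], x
    for W in Ws:
        z = W @ h
        pre.append(z)
        h = 1.0 / (1.0 + np.exp(-z)) if activation == "sigmoid" else np.maximum(z, 0.0)
    g = np.ones(width)  # gradient of a dummy loss w.r.t. the final activations
    for W, z in zip(reversed(Ws), reversed(pre)):
        if activation == "sigmoid":
            s = 1.0 / (1.0 + np.exp(-z))
            g = g * s * (1.0 - s)   # sigmoid derivative is at most 0.25
        else:
            g = g * (z > 0)         # ReLU derivative is 0 or 1
        g = W.T @ g
    return float(np.linalg.norm(g))

sig_norm  = grad_norm_first_layer("sigmoid", 0.1)                  # "stupid" init
relu_norm = grad_norm_first_layer("relu", np.sqrt(2.0 / width))    # He-style init
```

With twenty layers, `sig_norm` collapses to essentially zero (each layer shrinks the gradient by up to a factor of four), while `relu_norm` stays on the order of one, which is much of the story behind points 3 and 4.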