To better understand the possibilities of big data, Rabobank started with a multidisciplinary team and a few proofs of concept.
See on www.bigdata-startups.com
For some this is a more daunting task than for others. A fellow Factualite, for example, explained to me the predicament of having to acquire gifts for his young nephew, parents, brother, and wife. To cater to the widely varying desires of these different people, he must make the time to visit a number of different stores. Especially when it comes to picking out things like jewelry for your wife, we agreed it was necessary to visit brick-and-mortar stores instead of simply shopping online.
See on blog.factual.com
We quickly stood up a Palantir instance and are actively integrating and structuring OCHA reports, MapAction updates, various open source data sets relevant to the area of operations (OpenStreetMap, local government info, supply centers, shelters, markets, etc.), SITREPs from various relief organizations, assessment reports generated in the field, and more. By providing access to many different kinds of data, the most recent data, and helpful ways of interacting with that data, we hope to help Team Rubicon and Direct Relief better coordinate their actions with other NGOs and provide more relief to more people, faster.
See on www.palantir.com
Traditionally, transforming data, even at a small scale, is a cumbersome process that often takes much longer than the actual work of analyzing it. This problem is exacerbated in a big data environment, where there’s more data to work with and where new formats can make it harder to clean up. Trifacta’s software is currently in use by beta customers…
See on www.regrit.com
We all want to reach a larger audience while still maintaining a connection and relationship with that audience. We can do this through glocalisation — the process of thinking and acting locally, while appealing to a global audience — in our content and social media strategy.
See on blog.hubspot.com
[W]hat if you want to schedule both memory and CPU, and you need them in possibly different and changing proportions? If you have 6GB and three cores, and I have 4GB and two cores, it’s pretty clear that I should get the next container. What if you have 6GB and three cores, but I have 4GB and four cores? Sure, you have a larger total number of units, but cores might be more valuable. To complicate things further, I might care about CPU more than you do.
See on blog.cloudera.com
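One common answer to the multi-resource comparison problem the excerpt raises is Dominant Resource Fairness (DRF): each user's share of a resource is their allocation divided by the cluster's capacity, their *dominant* share is the largest of those fractions, and the next container goes to whoever's dominant share is smallest. A minimal sketch, using hypothetical cluster capacities (100 GB of memory, 40 cores) and the excerpt's second scenario:

```python
# Sketch of Dominant Resource Fairness (DRF) container assignment.
# Cluster capacities and user allocations here are illustrative, not
# taken from the source article.

CLUSTER = {"memory_gb": 100, "cores": 40}

def dominant_share(allocation, cluster=CLUSTER):
    """A user's dominant share is the largest fraction of any single
    cluster resource they currently hold."""
    return max(allocation[r] / cluster[r] for r in cluster)

def next_container(users, cluster=CLUSTER):
    """DRF awards the next container to the user with the SMALLEST
    dominant share, equalizing dominant shares over time."""
    return min(users, key=lambda u: dominant_share(users[u], cluster))

# The excerpt's second case: you hold 6 GB / 3 cores, I hold 4 GB / 4 cores.
users = {
    "you": {"memory_gb": 6, "cores": 3},  # dominant share: max(0.06, 0.075) = 0.075
    "me":  {"memory_gb": 4, "cores": 4},  # dominant share: max(0.04, 0.10)  = 0.10
}
print(next_container(users))  # "you" — your dominant share is lower
```

Note how this resolves the "cores might be more valuable" point: value is expressed through scarcity relative to cluster capacity, so four cores out of forty weigh more than 4 GB out of a hundred. Per-user preferences between resources, the excerpt's final wrinkle, need extensions beyond plain DRF.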