posted 10 days ago on gigaom
Cloud computing provides on-demand access to compute resources and advanced capabilities in a cash-flow-friendly delivery model. As the demand for greater corporate flexibility increases, cloud computing solutions are finding homes within numerous data-management ecosystems because organizations can quickly provision and scale resources to meet shifting customer demands. With advanced technology and lower capital expenditures, cloud computing allows IT departments to better serve users, and the technology delivered can match new workloads and bigger data sets on the most advanced platforms. In the past, enterprises mostly focused on structured information stored in relational databases, but with new technologies, all data — including social, machine, sensor and other data — can be incorporated into analytic processes to provide greater insights and predictive capabilities. Now large-volume, high-velocity and multi-structured data that was too complex or expensive for companies to collect and use can be effectively analyzed in a cloud environment. As a recognized leader in cloud computing, Teradata has flexible cloud solutions that deliver measurable business results. To learn more about the benefits of cloud computing and Teradata’s flexible array of cloud computing solutions, read Paul Barsch’s insightful article, “The Perfect Fit.”

Read More...
posted 10 days ago on gigaom
Massachusetts Governor Deval Patrick plans to propose legislation that would curtail the use of non-compete agreements by high-tech companies trying to keep employees from joining rivals. The move has drawn fire. Hopkinton-based EMC, which has used non-compete agreements and other legal tactics against employees who move to competitors, opposes it. Patrick wants Massachusetts to join California and other states that have adopted the Uniform Trade Secrets Act, which purports to prevent workers from taking employers’ intellectual property to other businesses. Story posted at bostonglobe.com. To leave a comment or share, visit: Mass. governor declares war on non-compete pacts

Read More...
posted 10 days ago on gigaom
Kinematics CEO Matthias Bürger snaps block after block together. Within 30 seconds, he has formed them into a racecar. He pulls out a remote control app on a tablet and begins directing the car to zoom forward and backward. A remote control car built out of TinkerBots modules. Photo by Signe Brewster. The car is the result of Kinematics’ dream to make a toy that is as easy to use as Legos, but connected. The collection of snap-together TinkerBots modules ranges from wheels to motors to adapters that let you add on regular Legos. TinkerBots kits are available for the first time today on Indiegogo, where they are selling for $159 to $499. Kinematics CEO Matthias Bürger. Photo by Signe Brewster. “You can really build anything with it,” Bürger said. “There’s no other toy that can…learn like this.” During a demo, Bürger created a little crane for a Lego man, an ant and a dog. Each model relies on a red cube known as a Power Brain that acts as the brain for the toy. It attaches to other modules that bend, twist or just sit still, allowing you to create whatever you want. Bürger said Kinematics is working on rotors to turn TinkerBots into drones, plus sensors that allow them to avoid obstacles or interact with light. At some point, the company is considering adding tiny solar panels that would put children in touch with their energy usage. Kinematics plans to provide a remote control app that is compatible with TinkerBots toys, but you can also directly teach them a motion: if you hit the record button on the Power Brain, move a TinkerBot and then press the play button, it will repeat the motion back to you. It’s a quick and simple way to make a robotic dog walk or an ant crawl. Adapter plates make TinkerBots compatible with Legos. Photo by Signe Brewster. If you are interested in programming a robot to do more complex tasks, TinkerBots are also Arduino compatible.
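The record-and-play trick can be pictured as a simple sample-and-replay loop. The sketch below is a hypothetical illustration, not the Power Brain's actual firmware (its internals aren't public here): while recording, it samples joint positions at a fixed rate through a caller-supplied function, and on play it steps back through the samples at the same rate.

```python
import time

class MotionRecorder:
    """Toy record/replay loop in the spirit of TinkerBots' record button.

    read_positions and write_positions are placeholders for whatever
    the hardware exposes for sensing and driving the motors."""

    def __init__(self, sample_hz=20):
        self.interval = 1.0 / sample_hz
        self.frames = []  # recorded joint-position snapshots

    def record(self, read_positions, duration_s):
        # Sample the current joint angles at a fixed rate while the
        # user physically moves the toy.
        end = time.time() + duration_s
        while time.time() < end:
            self.frames.append(read_positions())
            time.sleep(self.interval)

    def play(self, write_positions):
        # Drive the motors back through the recorded snapshots.
        for frame in self.frames:
            write_positions(frame)
            time.sleep(self.interval)
```

Because the recorded frames are just data, the same buffer could be replayed in a loop to make, say, a robotic dog walk continuously.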
Bürger said Kinematics wanted to make sure TinkerBots are entertaining to both children and adults, and that the kits grow with children as they become more experienced with robots. “Kids don’t just build the model you show them. They use their own creativity and come up with something totally different,” Bürger said. “You can’t believe what they come up with.”

Related research and analysis from Gigaom Research (subscriber content; sign up for a free trial):
- A look back at this year’s CES and what it means for tech in 2014
- How companies can grow by moving into newer, bigger markets
- Retail’s reinvention: technology’s impact on today’s supply chain

Read More...
posted 10 days ago on gigaom
Even as web-scale companies push their infrastructure to accommodate the looming internet of things, big, established enterprises are moving to cloud in a big way. Just a few years ago, companies like Netflix and Zynga were the poster children of cloud adoption while traditional, older enterprises still seemed to view cloud, especially public cloud, like it had cooties. What a difference a few years makes. If you needed proof that cloud is becoming the computing platform for big, older businesses (including companies with security concerns and compliance worries), there have been a few bellwethers. The CIA selected Amazon Web Services, the public cloud giant, to build its own secure private cloud. General Electric, about as blue-chip an American conglomerate as you could find, invested $105 million in Pivotal, which is banking that big customers are finally ready to rip and replace existing gear in order to capitalize on the data that the internet of things will generate and aggregate. That means a ton of investment in search of big returns. Researcher IHS expects enterprises to spend $235 billion on cloud architecture and related services by 2017, up 35 percent from the $174 billion it expects to be spent this year and triple the $78 billion spent in 2011. The acceptance of cloud as the go-to model for big companies will be a theme at Gigaom’s Structure show in June, where we’ll talk to IT pros from Domino Brands, The Gap — and yes — GE about their cloud decisions and IoT plans. Rodney Rogers, CEO and chairman of Virtustream, the Bethesda, Md.-based enterprise cloud provider, told me that trying to sell cloud to the enterprise in 2009, when the company was founded, “was like pushing a boulder uphill with a piece of string. RFPs [requests for proposals] were based more on morbid curiosity than anything,” he said in a recent interview.
That started to change in earnest in 2012, when more companies started to understand that “cloud” itself was more than just server virtualization writ large. And enterprise adoption grew in subsequent years when companies ran the numbers on how much it would cost to renew their in-house hardware versus “replatforming,” or moving those workloads to a private or even public cloud.

Cloud First helped

Gigaom Research analyst Ashar Baig said the U.S. government’s “Cloud First” mandate in 2011, which pushed agencies to use cloud to streamline operations, cut costs and improve accessibility, was a huge driver. “The U.S. government is the biggest IT customer in the world and when it said 45 percent of all government data had to be in the cloud by 2014, people listened,” Baig said. And if there were any remaining doubt that big-time customers want their products to come in cloud form, just look at recent announcements from the legacy IT vendors: SAP, Oracle and Microsoft, while building out their own clouds, are also offering most of their software goodies on AWS as well. If anyone had predicted this even three or four years ago, no one would have believed them. So a few threads: the IoT is spreading like kudzu, and big enterprises are updating their infrastructure, largely through cloud adoption, in a way that should enable them to capitalize on that in the not-too-distant future. And don’t forget, the top cloud execs from the major cloud providers will also be on hand at Structure 2014 to talk about the enterprise cloud: Amazon’s Werner Vogels, Google’s Urs Hölzle, HP’s Bill Veghte, IBM’s Lance Crosby, Microsoft’s Scott Guthrie, Rackspace’s Taylor Rhodes and VMware’s Bill Fathers.

Related research and analysis from Gigaom Research (subscriber content; sign up for a free trial):
- Defining Internal Cloud Options: From Appistry to VMware
- 2012: The Hadoop infrastructure market booms
- The Structure 50: The Top 50 Cloud Innovators

Read More...
posted 10 days ago on gigaom
Which clouds are doing well in the enterprise? RightScale has a unique perspective, and VP of marketing Kim Weins gives us a glimpse into what’s going on here. Amazon Web Services keeps rolling along, which is no surprise. More eye-opening was that RightScale’s 2014 State of the Cloud survey showed VMware’s vCloud Hybrid Service (vCHS) came in second, surprising since it’s been out less than a year. My bet, confirmed by Weins, is that some of the many, many VMware shops out there confused vCHS with vSphere, vCloud Director and the rest. “We call this vSoup,” she said. All joking aside, VMware has near 100 percent penetration in enterprise accounts, so anyone doubting that VMware has a shot in cloud should think again. Ditto Microsoft Azure. Of course, it depends on how many of those customers want to deepen their dependence on those vendors going forward. And Derrick Harris talks about the use of big data in agriculture — an important trend given the necessity of wringing the most food out of stressed resources. And we discuss Amazon’s latest management improvement, its new Cost Explorer, and its possible impact on the Amazon ecosystem.

SHOW NOTES
Hosts: Barb Darrow and Derrick Harris
Download This Episode
Subscribe in iTunes
The Structure Show RSS Feed

PREVIOUS EPISODES:
- Cloudera CEO breaks down Intel investment
- Don’t like your cloud price? Wait a second
- Pro tip: how to learn to love the cloud (and not worry when it shoots you through the head)
- How Declara brings together cloud, AI and social networks to help us learn better
- The state of private cloud

Related research and analysis from Gigaom Research (subscriber content; sign up for a free trial):
- The Structure 50: The Top 50 Cloud Innovators
- How OpenStack can become an enterprise reality
- What you missed in cloud in the third quarter of 2013

Read More...
posted 11 days ago on gigaom
NASA knows that liquid hydrogen fuel and lithium batteries can’t pack enough power to send ships and astronauts record distances from Earth. The agency announced today that it will offer up to $250,000 for battery alternatives it can use for Earth and deep-space missions. The funds come as part of NASA’s plans to invest over the next 18 months in technologies “that address several high priority challenges for achieving safe and affordable deep-space exploration,” associate administrator for space technology Michael Gazarik said in a statement. NASA expects to choose roughly four proposals divided into two categories: improvements on battery features like cell chemistry, packaging and cell integration, and technologies that surpass lithium batteries’ theorized limits altogether. The Department of Energy plans to pour $120 million into U.S. labs, universities and private companies between 2012 and 2017 for battery development. And while Argonne National Laboratory, the effort’s hub, has already unveiled several promising battery developments, the competition will also be an interesting opportunity for private firms to get involved. Elon Musk’s Tesla and SpaceX, for example, have poured a lot of time and money into the lithium batteries that power their cars and rockets, and are undoubtedly looking into other technologies. My colleague Katie Fehrenbacher wrote last month that there is already a rich range of alternatives to batteries for energy storage. There is compressed-air storage, and there are systems that pump water to the top of a hill and let it run back down to generate electricity. Some make much more sense for Earth than space, but they all show that energy storage is not just on NASA’s mind. Whatever battery alternatives are chosen will complement ongoing initiatives to reduce NASA’s reliance on liquid fuel, which is finite and expensive to carry into space. Some of its spacecraft already use solar energy.
The Curiosity rover, which landed on Mars in 2012, carries a small nuclear power source (a radioisotope generator, not a true reactor) in its belly. And there are already proposals to turn the moon and other planets into fueling outposts, where robots could mine resources and refuel docked spacecraft.

Related research and analysis from Gigaom Research (subscriber content; sign up for a free trial):
- Is the 3D printing market a hype, a hope, or a threat?
- What defines the key players of the IaaS industry
- Why going “green” matters for high-performance computing

Read More...
posted 11 days ago on gigaom
Longreads, which started with a Twitter hashtag and gradually evolved into a community of readers sharing longform fiction and journalism, said on Wednesday that it has been acquired by Automattic (see disclosure below), the for-profit company behind the WordPress.com blogging platform. WordPress founder and Automattic CEO Matt Mullenweg said that Longreads’ commitment to longform reading fits perfectly with his company’s desire to promote good content, and that all four of Longreads’ employees would be joining Automattic. “The world cannot live on 140 characters alone,” Mullenweg told Bloomberg Businessweek. “Longreads embodies a lot of what we really value with Automattic and WordPress.” WordPress software — either the open-source or the paid-for version — is the publishing engine behind millions of websites and some major media entities, including Gigaom, and WordPress.com has also been trying to become more of a portal that points readers towards good content. Automattic vice-president Raanan Bar-Cohen said in a post on the WordPress blog that the two companies made for a good fit because “we are entering a new era for independent writers and publishers to embrace depth and quality, and WordPress.com is committed to empowering these creators.” Mark Armstrong, founder of Longreads and a former journalist, said that the service would continue as it had before the acquisition, and that joining WordPress would allow the community to grow and expand. “So excited for what’s next with @Longreads and @wordpressdotcom,” Armstrong (@markarms) tweeted on April 9, 2014. “Thank you to everyone who has participated in the #longreads community.” The site currently offers a free weekly email or a $3 membership that provides daily recommendations.
In an email, Armstrong said that there’s a lot more that he wants to provide, and partnering with WordPress would allow him to do so: “The Longreads community, and the appetite for great storytelling, has grown so much since I first started the service five years ago. That said, I knew there was more that we could be doing to both serve readers and keep building a thriving ecosystem for independent publishers and writers. I first met Matt Mullenweg four years ago, and I have always been a huge fan of the mission and principles of Automattic and WordPress. It seemed like a perfect fit for Longreads to join up and explore new opportunities for readers, writers and publishers.”

Disclosure: Automattic is backed by True Ventures, a venture capital firm that is an investor in the parent company of Gigaom.

Related research and analysis from Gigaom Research (subscriber content; sign up for a free trial):
- Are Comments Facebook’s Next Big Service?
- What happened in social in the fourth-quarter 2013
- Why design is key for future hardware innovation

Read More...
posted 11 days ago on gigaom
Comcast revealed on Wednesday it has grown its Wi-Fi hotspot network to 1 million nodes. Considering that on Tuesday its FCC filing on its planned acquisition of Time Warner Cable listed 870,000 hotspots, it appears to be ramping up its wireless network quickly. Comcast can scale so quickly because its broadband customers are doing much of the heavy lifting. Its latest wireless home gateways all operate in dual modes, providing a private home network for the customer and a public network that can be accessed by any Comcast broadband customer. Comcast also offers public hotspot capabilities to all of its business customers and has built thousands of high-powered outdoor hotspots in key high-traffic zones in its operating territory. Comcast’s hotspot network in the mid-Atlantic (source: Comcast) Comcast isn’t breaking out how many neighborhood hotspots it’s running versus commercial access points, but the home hotspots make up the vast majority of its network. Comcast is part of the CableWiFi consortium, which pools together the outdoor and business hotspots of Time Warner Cable, Cox Communications, Cablevision Systems and Bright House Networks. CableWiFi has 200,000 hotspots in total, meaning Comcast has more than 800,000 access points transmitting from living room shelves. Though its hotspot network is a considerable resource for Comcast’s customers, it’s not the easiest to use. Customers still have to log in to each hotspot using their Xfinity credentials, but emerging technologies like Hotspot 2.0 will eventually make those connections automatic. When that happens, Comcast can turn its hotspot footprint into a kind of mobile data overlay, offloading smartphone and tablet traffic from cellular networks. Comcast told regulators it’s weighing using that footprint to create a Wi-Fi First mobile network, using cellular systems to fill in the gaps between its hotspots.
It hasn’t revealed whether it would sell such a service to consumers or sell Wi-Fi capacity to other carriers.

Related research and analysis from Gigaom Research (subscriber content; sign up for a free trial):
- What the shift to the cloud means for the future EPG
- The living room reinvented: trends, technologies and companies to watch

Read More...
posted 11 days ago on gigaom
Think about being a hospital that wants to improve survival rates for patients. You have lots of data about patients — their medical histories, EKG readings, room numbers, doctors, billing information and much more — and you certainly know whether they leave alive or dead. Somewhere in all that data, the current thinking goes, there must be a formula that can predict what’s going to happen. It’s not so much a big data problem as a complex data problem. According to Patrick Lilley, co-founder and CEO of an Aliso Viejo, Calif., startup called Emerald Logic, the real world runs on systems where there are inputs and outcomes; it’s only the complexity of the data we’re generating that makes it very difficult to find the inputs that will lead to the best outcomes. He equates it to sticking a marble in a black box, eventually getting it out the other side, and then having to diagram what you think the inside looks like. “The challenge there is you have to model what’s going on in that system and you can’t often look inside,” he said. Lilley also claims his company can help you find the answer. The company’s software, called FACET (short for Fast Collective Evolution Technology), tests tens of thousands of algorithms against a dataset in order to find ones that represent the relationships between the data and the end result. He calls the process “evolutionary computing” because the algorithms evolve, mate and migrate, and only the fittest survive. “This is a monkeys-on-typewriters sort of problem,” he acknowledged, referencing those theories about how long it would take a group of primates to reproduce the complete works of Shakespeare. The software doesn’t know anything about the field it’s working in or have any presuppositions about what’s in it. It’s simply trying to predict one thing from another, and Lilley says it’s pretty effective.
FACET works by taking a sample of a dataset, generating tens of thousands of algorithms from it, and then testing them in order to determine the most predictive one. “Because it’s evolution, it tends to wash away the variables and the math operators that are unimportant,” Lilley explained. “… No more than eight things have ever mattered in any model we’ve ever generated.” Once the process is complete, the algorithm is tested against new data in order to ensure its predictions are still accurate. Emerald Logic delivers FACET as a cloud service, so customers really only pay for the algorithm it produces. Customers own all the intellectual property associated with it, and Emerald Logic charges based on the economic value of the problem it’s trying to solve.

A hot field for startups, actually

All of this probably sounds a little too good to be true — and maybe it is — but Emerald Logic is really just putting a different spin on something that multiple startups are also pushing. There are BeyondCore, with its service for finding the variables most statistically relevant to a given outcome, and Emcien. There’s Ayasdi, which runs thousands of machine learning algorithms to discover and then visualize connections among massive datasets. Emerald Logic’s promise actually sounds similar to that of Nutonian, a startup from former Cornell Creative Machine Labs researcher Michael Schmidt that claims its Eureqa software can “calculate laws of physics” present in business data. Each approach runs into the question of whether anyone can trust some software to uncover what’s important in their data, but blind trust isn’t really required. Once data scientists or business analysts see what the software has come up with, they can dive in and look at the variables, examine the connections, and figure out if they buy into it. They can run tests to determine if maybe there’s something there worth investigating further.
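The generate-score-select-mutate loop Lilley describes can be illustrated with a toy evolutionary search. The sketch below is not FACET (which evolves whole algorithms and their operators); it merely evolves a weight vector against a small dataset, but the shape of the process is the same: a population of candidates, a fitness score on the data, survival of the fittest, and mutation of the survivors.

```python
import random

def evolve_predictor(X, y, generations=200, pop_size=50, seed=0):
    """Toy evolutionary search: each candidate is a weight vector,
    fitness is squared prediction error, and each generation keeps
    the fittest half and fills the rest with mutated offspring."""
    rng = random.Random(seed)
    n = len(X[0])

    def error(w):
        # Fitness: total squared error of the linear prediction w . x
        return sum((sum(wi * xi for wi, xi in zip(w, row)) - yi) ** 2
                   for row, yi in zip(X, y))

    # Random initial population
    pop = [[rng.uniform(-1, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=error)
        survivors = pop[:pop_size // 2]                      # selection
        children = [[w + rng.gauss(0, 0.1) for w in rng.choice(survivors)]
                    for _ in range(pop_size - len(survivors))]  # mutation
        pop = survivors + children
    return min(pop, key=error)
```

On a dataset generated by, say, y = 2*x1 + 3*x2, the evolved weights drift toward (2, 3) without the search knowing anything about the domain, which mirrors Lilley's point that evolution "washes away" the unimportant parts.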
“Artificial imagination”

Besides, Lilley argued, he has proof that FACET works. The company did work with King’s College London around identifying markers for Alzheimer’s disease, and FACET highlighted 14 markers out of a list of 11,000 possibilities. Half of them had already been mentioned in prior literature, a quarter had been thought of as possible markers and the remaining quarter were novel to FACET. It would have been easy to ignore what the software found had it not validated those previous findings and inklings, Lilley said. According to a February 2013 press release announcing that partnership, “Using these markers, plus APOE genetic information and demographics, the collaborators produced a mathematical classifier of 94% accuracy in distinguishing Alzheimer’s study subjects from controls or those with mild cognitive impairment.” In finance, FACET routinely finds that how a company incorporates is a strong predictor of whether it will succeed. In consumer loans, it has found that “effectively, liars tell longer stories,” Lilley said. And in fact, he noted, Emerald Logic is his third startup with the same co-founder, and FACET is kind of just an iteration on the technologies of the previous two. The first, called Digital Transit, used a genetic algorithm to do over-the-air software updates for mobile phones. That company merged with Bitfone in 2001, and HP acquired Bitfone in 2006. In January 2014, Qualcomm bought the associated patents from HP. The second startup, called Deep Six Technologies, generated decision trees based on data about email servers in order to do spam detection. The two founders have been working on Emerald Logic since 2011. Whether or not FACET — or anything of its ilk — turns out to be a magic bullet, they’re all working under the same assumption that has driven the push of big data technologies and data science into the mainstream.
Namely, that if data really does contain answers to tricky problems, there’s no way a person can figure out all the right questions to ask to find those answers among thousands of different variables. At some point, some parts of the process must be automated in order to steer people in the right direction. This is why Lilley refers to Emerald Logic and FACET as “artificial imagination” rather than “artificial intelligence.” “The more expertise someone has in a field,” he explained, “the more they know better and the less they sort of look around. … This method is pretty sideways. It’s not the way people are used to thinking about the problem.” Feature image courtesy of Shutterstock user phipatbig.

Related research and analysis from Gigaom Research (subscriber content; sign up for a free trial):
- Why we must apply big data analytics to human-generated data
- How to use big data to make better business decisions
- Connected world: the consumer technology revolution

Read More...
posted 11 days ago on gigaom
Apple’s vaunted design group appears to be going through some changes. According to a report from 9to5Mac, friction between head of software design Greg Christie and design SVP Jony Ive (pictured) has resulted in a shakeup that led to Ive taking control over all design activity related to software and hardware, and will likely see Christie leave the company. Citing unnamed sources, 9to5Mac stated the two began clashing after Ive took over human interface design responsibilities in 2012, in a move that also saw the departure of former Apple software leader Scott Forstall. When Ive’s team took on the redesign of the iPhone’s interface to create iOS 7’s new look, he came into conflict with Christie, who wanted to go in a different design direction, the report said. That clash led to Ive wresting control of the project during the new OS’s development, according to 9to5Mac. Now everything design-related at Apple will fall under Ive, who is a legend within the design community for his role in developing the products that still have Apple raking in boatloads of cash. But Christie was a key contributor. According to a recent (and rare) interview granted to The Wall Street Journal, Christie — who first came to Apple in 1996 to work on the Newton, the report said — was deeply involved with the creation of iOS and the iPhone.

Related research and analysis from Gigaom Research (subscriber content; sign up for a free trial):
- The rebirth of hardware demands new definition of design
- A demographic and business model analysis of today’s app developer
- What You Need to Know About the SoftBank-Sprint Merger

Read More...
posted 11 days ago on gigaom
Even judging by the low standards of creepy data-mining apps, “Brightest Flashlight” did something pretty egregious. The free app, which was installed by at least 50 million Android users, transmitted users’ real-time locations to ad networks and other third parties. It was, in other words, a stalking device disguised as a flashlight. In December, the Federal Trade Commission exposed the app’s antics and also announced a proposed settlement with the app maker, GoldenShores Technologies, a one-man operation based in Idaho. In doing so, the agency explained how Brightest Flashlight used legal flim-flam in a privacy policy and user license agreement to obscure what the app was up to. The terms are now final, and they’re underwhelming, to put it mildly. In a Wednesday announcement, the FTC confirmed that GoldenShores and owner Erik Geidl are not to collect app users’ geolocation without clearly explaining how and why they’re doing so and saying, in broad terms, who is receiving that information. The flashlight app maker will also have to keep records for the FTC to inspect, and Geidl will have to tell the agency about any new businesses he decides to start in the next 10 years. On paper, the order looks like stern stuff but, in practice, it’s hard to see how this amounts to real punishment. Even though Geidl did something deeply unethical, compromising the privacy of tens of millions of people, he will not pay a cent for his misdeeds. The FTC said earlier that it didn’t seek financial restitution because the app was free. The agency’s justification is unsatisfying, however, because it doesn’t acknowledge that Geidl must have earned income by selling users’ geolocation. A better approach would have been to strip him of any profits he made through the app, and also name-and-shame the advertisers who bought the information from him.
While it’s good that the FTC is helping to publicize the mischief of app makers, it’s unlikely that bad actors will take the agency seriously until it starts handing down real punishments to people like Geidl and the ecosystem that sustains them.

Related research and analysis from Gigaom Research (subscriber content; sign up for a free trial):
- Consumer privacy in the mobile advertising era: challenges and best practices
- Connected world: the consumer technology revolution
- Sector RoadMap: Content personalization in 2013

Read More...
posted 11 days ago on gigaom
As Wi-Fi starts making its way into more internet-of-things gadgets, connecting those devices to Wi-Fi networks is becoming a chore. These activity trackers, thermostats and cameras don’t necessarily have the user interfaces or even screens we would use to configure a Wi-Fi connection on our smartphones or PCs. The Wi-Fi Alliance is now trying to make those connections easier with the help of near-field communications (NFC). The Alliance has updated its Wi-Fi Protected Setup certification program to support NFC verification. Instead of entering a password or holding down buttons, you simply tap two Wi-Fi devices with NFC chips together to establish a connection. The technology can be used to connect devices to a local network by tapping a router, or two end-user devices by tapping them together. For example, I’ve been testing out Whistle’s dog activity tracker for the last few months, which uses both Bluetooth to connect to my phone and Wi-Fi to connect to my home network. Connecting my Whistle to my home network is a multi-step task, requiring first pairing the gadget with my phone over Bluetooth and then configuring the device to connect to my Wi-Fi through Whistle’s smartphone app. Whistle is more useful the more networks it connects to, but if I wanted to add additional Wi-Fi networks to the device — say at my parents’ place or at the kennel — the owners of those networks would have to go through the same process. The Whistle canine activity tracker (source: Whistle) The new Wi-Fi Protected Setup capability (and an NFC chip) would let Whistle connect instantly to the network over a secure WPA2 connection with a mere bump against the router. Of course, that’s assuming you want to give that kind of easy access to the world of internet-of-things devices. Wi-Fi Protected Setup uses proximity as security, assuming that if you can get close to a router or gadget, then your device is authorized to share connectivity. Not everyone wants their Wi-Fi networks — or devices — to be so open.
A small startup called Pylon is exploring some interesting use cases for NFC-brokered connections in the home that may address some of those security concerns. It has developed a Wi-Fi beacon that creates a guest wireless network that can be accessed with an NFC tap or a “bump” of the iPhone (the accelerometers in the devices trigger the handshake). Instead of granting all network rights to those guest devices, Pylon could restrict users to internet access only, and for a short interval, say 30 minutes. The Wi-Fi Alliance said it is now certifying devices using the new technology, and among the gadgets on its test list is Google’s Nexus 10 tablet. I wouldn’t, however, expect a huge flood of new gadgets using the capabilities. While NFC is making it into more and more smartphones, it’s still rare in devices like wearables and smart appliances. The goal of many of these device manufacturers is to make their devices as inexpensive as possible, and adding an additional radio contradicts that goal. Still, there could be a lot of use cases for NFC-brokered connections in smartphones. Instead of trying to dig up passwords whenever a friend wants to connect to your home network, they could just tap to connect. And as Wi-Fi hotspots make their way into connected cars, Wi-Fi Protected Setup could be a brilliantly simple way to connect a tablet to the in-car network.

Related research and analysis from Gigaom Research (subscriber content; sign up for a free trial):
- How new devices, networks, and consumer habits will change the web experience
- Analyzing the wearable computing market
- Hyperlocal: opportunities for publishers and developers

Read More...
posted 11 days ago on gigaom
Mailbox, the iOS email app that Dropbox acquired last March, is expanding to two new platforms — Android and Mac OS X. Mailbox for Android, available as of Wednesday, is an equivalent experience to its iOS cousin, but adds a new feature called “Auto-swipe.” When a user takes action on a particular email, like archiving a daily deals wrap-up, Mailbox will remember it and ask to automate that action for the future. Acting as a new filter, all Auto-swipe preferences will be synced across devices. “Starting today, Mailbox uses Dropbox to sync preferences and Auto-swipe patterns across email accounts and devices,” the company said in a blog post. “That means you get a seamless experience no matter which device or email account you use.” The feature will be available in an update to the iOS app. But Mailbox for Desktop, released in limited beta for OS X only, is the more exciting product. The minimalistic interface remains true to the app’s mobile roots, and it ports many of the common gestures on Mailbox directly to a trackpad. This means the app remains consistent across all platforms, keeping user confusion low. As of now, no release date is set for the product. Mailbox for Android. Images courtesy of Dropbox. Mailbox for Android and Mac was one of many announcements made at a Dropbox special event on Wednesday in San Francisco. The company also announced a new business platform, as well as photo-sharing app Carousel.
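The Auto-swipe behavior described above amounts to learning a filter from repeated actions. Here is a minimal sketch of that idea; the class name, threshold, and data model are all assumptions for illustration, not Mailbox's implementation.

```python
from collections import Counter

class AutoSwipe:
    """Toy Auto-swipe-style filter: after a user repeats the same swipe
    on mail from one sender, automate that action for future messages."""
    def __init__(self, threshold=2):
        self.threshold = threshold
        self.history = Counter()   # (sender, action) -> how often swiped
        self.rules = {}            # sender -> learned automated action

    def record_swipe(self, sender, action):
        self.history[(sender, action)] += 1
        if self.history[(sender, action)] >= self.threshold:
            self.rules[sender] = action   # "remember it" going forward

    def incoming(self, sender):
        # Return the automated action, or None to leave it in the inbox.
        return self.rules.get(sender)

mailbox = AutoSwipe()
mailbox.record_swipe("deals@example.com", "archive")
mailbox.record_swipe("deals@example.com", "archive")
print(mailbox.incoming("deals@example.com"))  # archive
```

Syncing the `rules` dictionary across devices, as the post says Dropbox now does for preferences and Auto-swipe patterns, is what makes the learned filter follow the user everywhere.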

Read More...
posted 11 days ago on gigaom
Dropbox has rewritten its popular namesake file-share application to be more enterprise friendly. The goal here is to reassure businesses that Dropbox for Business is indeed to be trusted with corporate content and, oh by the way, to get more companies to buy it as a result. “We rebuilt all of Dropbox to give everyone two Dropboxes: one personal, with your password and your contacts, and a second company Dropbox accessible to you but managed by the company. But you don’t want it to be klugey and hard to go back and forth. Most of us have one phone so we had to reengineer our interfaces,” said Ilya Fushman, director of product, mobile and business for the San Francisco-based company, ahead of an event in the city designed to highlight the new features. Before, if Jane Doe had her personal Dropbox on her device and wanted to sign into the old Dropbox for Business, she really couldn’t do so without some sort of hack. The revamped Dropbox solves that problem by letting Doe, the individual, have that personal account and Doe, the employee, have an IT-managed business account accessible from the same device. And now, if Doe loses her device, corporate data can be wiped clean remotely. If she quits, IT can reassign business content to another authorized user — a key demand that Dropbox for Business had not met until now. Oh, and the new business version also gives IT a view into who opened what documents when. Such auditing is crucial to many companies. With the old setup, the difficulty of accessing both personal and work Dropbox from one device meant users often synched work documents with their personal accounts. That sets up the sort of data leakage scenario that gives IT fits. In theory, the easy coexistence of accounts on one device will alleviate that problem. In addition, segregating content on the business accounts means that third-party vendors — like NCrypted Cloud and BoxCryptor — can focus on layering additional security on that data. 
If much of this sounds familiar, it’s because these administrative capabilities are what Box and other would-be Dropbox-for-the-enterprise companies already offer. Box, for example, has offered remote wipe via partnerships with Good Technology and MobileIron for some time. What those vendors may not have is Dropbox’s gigantic name recognition among consumers. That brand already made the older Dropbox for Business an easy sell into small businesses, said Jim Turner, president of Hilltop Consultants, a managed service provider. End users want Dropbox and their bosses sign off on it because it’s a well-known name backed by a trusted partner, he said. At a special event Wednesday in San Francisco, the company also announced a collaboration tool called “Project Harmony,” designed to help remedy conflicts when two users are working on the same file. Deployed within native apps, starting with Microsoft Office for both Mac and PC, Dropbox pops up a small notification window that tells the user who, and how many people, are actually working on the file at the same time. The feature also includes a chat system and live refresh so users can quickly update documents. Dropbox now claims 275 million (!) users, and that it is used in 97 percent of Fortune 500 accounts, but still does not break out paying customers. The company really needs to make it easier for people who bring Dropbox in from home to upgrade to a paid business account without sacrificing ease of use. This is just another step down that road. Dropbox, for all its resources, still may have a tough row to hoe in business accounts where Microsoft and Google are making big plays with OneDrive and Google Drive, respectively.

Read More...
posted 11 days ago on gigaom
On Wednesday Dropbox announced a batch of new consumer products for both mobile and desktop users. But Drew Houston, co-founder and CEO, saved the best for last: Carousel, a photo-sharing app that lets users sync photos seamlessly through Dropbox. “When I’m looking at the photos on my phone, I see photos I took with the phone, but I don’t see pictures from my computer or anything else,” Houston said at a special event in San Francisco. “Instead of one shoebox, I have many shoeboxes.” Carousel is a stand-alone app that Houston said the company has been working on for years. It works kind of like a Facebook Timeline, organizing photos into events and surfacing important photos to highlight each event. The app doesn’t actually store any photos or videos locally on a phone — everything is in the cloud, but accessible at all times. Carousel also has auto-backup, syncing photos across platforms. Drew Houston debuts Dropbox’s new photo sharing app Carousel. Photo by Lauren Hockenson/Gigaom “Everywhere you are, you have every picture you’ve ever taken organized in a beautiful timeline,” Houston explained. “But not only that, you have every picture that’s been taken of you.” Photos can also be shared between users in an interface reminiscent of a chat window: users can tap to quickly share batches of photos, comment on them with a message, share with both individuals and groups, and sync received photos and videos directly to their Timeline with a click. The announcement comes after the company raised a $350 million funding round in February of this year. Dropbox now has more than 275 million users — a 75 million increase from last year. Carousel is available today for iOS and Android.

Read More...
posted 11 days ago on gigaom
Former New York Mayor and entrepreneur Michael Bloomberg said at the Bloomberg Energy Summit on Wednesday that there should be solutions and systems in place to help the people who have lost jobs because of closed coal plants and mines. Though he says he gives “a lot of money to the Sierra Club” to help close dirty coal plants as natural gas and clean energy projects come online, he reiterated that as a society we have to “have some compassion to do it gently.” Subsidies to help displaced workers are one option, said Bloomberg, while re-training is another. But, in a slight to the tech industry’s sometimes out-of-touch relationship with workers outside of Silicon Valley, he said retraining needs to be realistic: “You’re not going to teach a coal miner to code. Mark Zuckerberg says you teach them [people] to code and everything will be great. I don’t know how to break it to you . . . but no.” The comment about Zuckerberg is just the latest indication of the backlash against the tech industry, which has developed a reputation for being filled with over-privileged, over-paid developers and execs who have little compassion for the struggles outside of the tech bubble. This reputation has culminated in protests against Google buses, and even against tech executives. Outside of the tech industry, there’s a significant energy shift happening across the U.S. Older coal plants are being closed, following the rise of cheap natural gas, as well as the EPA’s plans to regulate carbon emissions (coal is the dirtiest form of electricity generation). Between 2007 and 2012, coal’s contribution to the U.S. electricity supply went from 50 percent to 37 percent. In an unusual twist, some of the country’s coal mines are not being shut down along with the plants, but are instead sending coal to quickly developing countries like India. However, few new coal plants are being built. New coal power made up only 10 percent of the total newly added electricity generation in 2013. 
At the same time, new natural gas made up almost half of the new electricity capacity, solar made up about a third, and wind delivered about 7 percent. But as coal plants and mines are closed, there will obviously be displaced workers in coal-heavy states that are already facing a suffering economy. This will have major implications for society, and we need to find solutions to help those workers, said Bloomberg.

Read More...
posted 11 days ago on gigaom
This is pretty cool: Some developers from the Telefonica-owned European mobile carrier O2 have been toying with the idea of using an Android smart watch as a Chromecast remote. The O2 Lab team first used Google’s newly released Android Wear SDK to build a basic remote control app, and then ported the same app to Sony’s Android-powered smart watch — presumably because they wanted to run it on an actual device as well. Check out the video below:

Read More...
posted 11 days ago on gigaom
As T-Mobile continues to chip away at its larger rivals’ business, it’s starting to scale down its mobile data plan prices to attract more budget-minded smartphone users. On Saturday, it will introduce a new entry-level plan priced at $40 a month and offering 500 MB of data as well as unlimited voice and SMS. This plan, called Simple Starter, is a bit different from its regular Simple Choice plans, which start at $50 a month. Instead of throttling data speeds after customers hit their monthly data caps, T-Mobile is suspending data service after customers hit 500 MB in a billing cycle, restoring data access when a new billing cycle comes into effect. T-Mobile is positioning the plan as a way for cost-conscious consumers to avoid overage charges. Though T-Mobile technically doesn’t charge overages on any of its plans, Verizon and AT&T will automatically tack another data bucket onto your bill once you hit your cap. That said, T-Mobile is also providing an option for customers to buy data a la carte if they’re stranded mid-billing cycle without a data connection: a one-day 500-MB pass costs $5 and a seven-day 1-GB pass costs $10. Essentially, if you’re a light data user, this plan makes a lot of sense. You get full access to T-Mobile’s LTE network and never have to worry about having your data speeds throttled. But if you’re creeping over 500 MB more than a couple of months a year, then it’s probably not worth your while. The cost of buying passes to maintain your service will erase any savings over T-Mobile’s Simple Choice plans — unless you’re willing to restrict your mobile internet usage to Wi-Fi. Simple Starter also doesn’t give customers access to T-Mobile’s new free international data roaming and texting benefits, substantial perks for anyone who travels overseas. It’s designed as a domestic-only plan. 
Still, it’s very interesting to see T-Mobile scale prices down — creating cheaper options for consumers — rather than just pile more data onto its existing plans, though it’s obviously restricting some features. We’re going to see more emerge from T-Mobile in the next few days as it tweaks its Un-carrier strategy. In a T-Mobile blog post, CEO John Legere said T-Mobile would be making a new announcement each day until Friday, so stay tuned.
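To make the break-even reasoning concrete, here is a back-of-the-envelope cost model using the plan prices from the article. The usage pattern (one $10 seven-day 1-GB pass per month in which you blow through the cap) is an assumption, not anything T-Mobile publishes.

```python
def annual_cost_simple_starter(months_over, passes_per_over_month=1, pass_price=10):
    # Simple Starter: $40/month, data cut off at 500 MB; assume you buy
    # one a-la-carte pass each month you run over the cap.
    return 12 * 40 + months_over * passes_per_over_month * pass_price

def annual_cost_simple_choice():
    # Simple Choice starts at $50/month, with throttling instead of cutoffs.
    return 12 * 50

light_user = annual_cost_simple_starter(months_over=2)   # over cap twice a year
heavy_user = annual_cost_simple_starter(months_over=12)  # over cap every month
print(light_user, heavy_user, annual_cost_simple_choice())
```

Under these assumptions the light user pays $500 a year versus $600 on Simple Choice, while someone buying a pass every month lands at $600, which is exactly why the plan stops making sense once over-cap months become routine.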

Read More...
posted 11 days ago on gigaom
It’s hard enough being the editor of a newspaper like the Washington Post at the best of times — and these are clearly not the best of times — but it has to be even harder to try and defend your decision to tell a star blogger like Ezra Klein to take a hike, especially after he has just launched a well-received site financed by a competitor. Post editor-in-chief Marty Baron did his best in a recent Q&A, but in doing so he provided even more reasons to doubt whether the Post is thinking clearly about its future and how to get there. Baron’s comments came after a recent speech to the International Symposium on Online Journalism in Austin, one in which he expressed optimism about the future of journalism, and mentioned that having journalists like Klein leave to start new things was a natural part of the process and not a sign that newspapers were dying, etc. “Saying you’re the ‘next big thing’ and hiring a bunch of people is the easy part,” he said. In the question period that followed, Baron addressed Klein’s departure more directly — here’s an excerpt of what he said, as transcribed by Nieman Journalism Lab editor Josh Benton: “I think people have been left with the impression with the coverage that somehow he was trying to do this within the umbrella of The Washington Post, and that’s just simply not the case. What Ezra said when he came to senior executives at the Post — and I was the first one he came to, as far as I know — was that he wanted to create an entirely new news organization, something entirely separate from the Post. And that he would be in charge of it — he would be the president, the CEO, the editor-in-chief, he would select the technology, he would select the advertising chief — pretty much everything. And it would exist outside the framework of The Washington Post. 
It was not a request for more financing for his venture within the Post called Wonkblog, which we had financed to the tune of millions of dollars over many years.”

Financially sensible, but still wrong

Baron went on to say that Klein was looking for a sum equivalent to 10 percent of the Post’s entire annual newsroom budget, which works out to about $10 million (although Vox Media CEO Jim Bankoff, Klein’s new boss, has cast some doubt on that figure in the past), and that he simply couldn’t justify spending that kind of money — especially on something that would live outside the Post as a separate entity, controlled by Klein. Marty Baron is not a stupid man, so this argument makes a lot of sense. He is running a large newspaper, one whose financial health is still somewhat dodgy — despite the fact that it is now owned by Amazon founder Jeff Bezos — and spending $10 million on a wild idea from a single blogger to create a kind of news Wikipedia probably seems like a dumb thing to do. As far as we can tell, Bezos thought it was dumb as well, or he would have cut some kind of deal. But as Felix Salmon at Reuters points out, this is exactly the kind of bet that the Post arguably needs to be making at this point. Does it need to husband its resources and try to make its existing business more economically sound? Of course it does. But it needs to make some big bets as well, I think — and backing Klein would have been a smart one, as I tried to point out before he left the paper to start what became Vox. Josh Benton said in his post that some of the coverage of Klein’s departure falls into the “preconceived narrative that Newspaper People Are Dumb And Internet People Get It,” and there is much truth to that. And all I would argue is that newspaper people like Baron need to take some leaps of faith if they are to survive the disruption going on all around them. 
As Salmon notes, what Klein was proposing wasn’t really that different from the model that Kara Swisher and Walt Mossberg pioneered with All Things Digital — a standalone site that was wholly owned by Dow Jones, but run as a separate entity. Of course, Dow Jones arguably failed to really take advantage of that model even when it was working well, and eventually Swisher and Co. left to launch Re/code with NBC Digital instead (is that because Dow Jones is dumb? Of course not. Just unable to see the forest for all the lumber).

An alternate future that could have been

As more than one person noted even before Klein left the Post, the newspaper has already missed one potentially large opportunity of this kind by allowing the team that formed Politico to leave and set up a competing organization. Why make it a two-fer with Klein? Picture this as an alternative future: the Post chooses to keep Politico’s founders in-house and gives them enough financing to start a standalone site that effectively accomplishes the same thing, except that it is either wholly owned or majority-owned by the Post. Then the paper funds Ezra Klein’s Project X using the same model, and maybe one or two other things as well. What does that future look like compared to the one it has now? In some ways, as Salmon points out, Ezra Klein is better off with a company like Vox, which understands digital technology and how to build new platforms — something the Post understands about as well as it understands quantum mechanics. And Politico is probably better off as well as a standalone entity. In fact, it’s debatable whether either one would have become what it is now if it had been started within the Post. But it would have been interesting to see them try, and the Post would likely have learned a lot in the process. And if they had succeeded, it would have had an ownership stake in two digital-only entities that could have become growth engines in a way the Post website likely never will. 
Post and photo thumbnails courtesy of Thinkstock / Janie Airey and Flickr users Son of Broccoli and George Kelly

Read More...
posted 11 days ago on gigaom
The public price wars are fun to watch. A day after Google announced sweeping price cuts last month, Amazon Web Services lowered prices for the 42nd time in its history. Moreover, Microsoft, which is trying to be as big in the cloud as AWS and Google, matched AWS’ price cuts. At Structure 2014, we have all the major cloud players at the same event to tell you why their cloud is best, and, perhaps, now the cheapest. Joining us will be Urs Hölzle, senior VP for Google’s technical infrastructure and a Google fellow; Scott Guthrie, corporate VP of the Microsoft Cloud and Enterprise division; and Amazon Web Services CTO Dr. Werner Vogels. Will Google’s plans in the cloud computing market continue to dictate what AWS and Microsoft offer? What does Microsoft’s transformation into an IaaS cloud provider mean for new and potential users? Is the concept of not pushing Windows in the Azure cloud really a good idea? And will the 800-pound gorilla that currently has a huge market lead on both Microsoft and Google be able to keep up the momentum? What will the race for dominance in the cloud computing market look like in five years? Come to Structure 2014 to better understand what’s out there, where it’s going and how it will impact your enterprise. Register for Structure by this Friday, April 11 and save $300. –Clare Ryan

Read More...
posted 11 days ago on gigaom
Free Mobile has already kicked off a price war in France with its ultra-cheap mobile voice and data plans. It could be looking to lead a consolidation drive in its home country as well. According to a report in Le Parisien, Free’s parent company Iliad is in discussions with Bouygues Telecom to buy out its wireless operations, haggling over a price somewhere between €5 billion ($6.9 billion) and €8 billion ($11 billion). Free’s aggressive pricing is already putting pressure on French mobile operators. Vivendi decided to sell off SFR, France’s third-largest carrier, to cable provider Numericable. Bouygues was in the running to buy SFR as well, but failing to win the prize, it appears to have become an acquisition target itself. Free Mobile has kicked off a French Revolution of its own (source: The Louvre) If the deal goes through, Free would gain access to a nationwide mobile voice and data network, something it has so far managed to do without. Free has adopted a Wi-Fi-first strategy by turning Iliad customers’ broadband gateways into hybrid public-private hotspots. The idea is to move as much mobile traffic as possible onto those Wi-Fi nodes to avoid the cost of using expensive cellular networks. Wi-Fi-first is a strategy that virtual operators Republic Wireless and Scratch Wireless have adopted in the U.S., and it may even wind up being the path that Comcast and Google use to get into the mobile carrier business. Free is augmenting that Wi-Fi network with home femtocells and strategically placed cellular towers, but it’s also tapping Orange’s cellular network through a wholesale agreement to fill in the many gaps in between. Buying Bouygues means Free would no longer need Orange, which as the largest carrier in France is also Free’s biggest competitor. But by running its own extensive cellular network, Free’s operational and infrastructure costs would increase enormously. 
Free’s lean Wi-Fi-first strategy would wind up getting much more bulbous.
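The Wi-Fi-first approach described above boils down to a cost-ordered bearer preference: use your own cheap infrastructure whenever it is in range, and fall back to wholesale cellular only when it is not. A toy sketch of that selection logic, with per-megabyte costs invented purely for illustration:

```python
def pick_bearer(available):
    """Wi-Fi-first bearer selection sketch: prefer the operator's own
    hotspots, then femtocells, then the (expensive) wholesale cellular
    network. The cost figures are made up for illustration."""
    preference = [
        ("wifi", 0.0),        # own hotspot network: effectively free
        ("femtocell", 0.001), # own small cells: cheap
        ("cellular", 0.01),   # wholesale partner network: costly
    ]
    for bearer, cost_per_mb in preference:
        if bearer in available:
            return bearer, cost_per_mb
    return None, None

print(pick_bearer({"femtocell", "cellular"}))  # ('femtocell', 0.001)
```

Owning the cellular tier outright, as buying Bouygues would mean, removes the wholesale fee but replaces it with the fixed cost of running the network, which is the trade-off the article highlights.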

Read More...
posted 11 days ago on gigaom
The Senate Judiciary Committee began grilling Comcast at 10 AM ET on Wednesday morning over its $45.2 billion proposal to swallow Time Warner Cable. The hearing, which comes a day after the company justified the merger in an FCC filing, features Comcast executives, consumer advocates and academics explaining why the deal is good — and bad — for consumers. The hearing will showcase some of the potential antitrust issues related to the deal, which could in turn lead the Justice Department to threaten a lawsuit to stop it. Here’s an overview of those issues, and what people are saying about them.

Vertical, not horizontal, concerns

Comcast has been quick to point out that its acquisition of Time Warner Cable would not affect horizontal competition since the two companies don’t operate in the same markets. It has even volunteered to divest 3 million subscribers so that the combined entity would have less than 30 percent of the pay TV market. This puts the deal on solid ground when it comes to concerns about eliminating competitors. “There’s not much in terms of horizontal competition problems,” said Andre Barlow, an antitrust lawyer who formerly worked at the Justice Department. “The government really needs to make out a great vertical case to support an anti-competitive theory.” Such a vertical case, based on antitrust issues up and down the cable TV supply chain, is a harder case to make, especially as the government has rarely succeeded on vertical antitrust theories in the past.

Monopoly and monopsony

Among the wonks, the debate over Comcast acquiring Time Warner Cable has led to renewed discussion of “monopsony” — the notion of an all-powerful buyer — rather than monopoly, which relates to an all-powerful seller. 
In this case, the combined cable company’s monopsony would stem from its unprecedented power over content creators like TV studios and broadcasters: Comcast could be in a position to demand that the studios lower prices or abide by other terms in order to reach Comcast subscribers. As Reuters and the New York Times reported, the monopsony theory has been getting traction in the Justice Department and among academics. Successful legal challenges, however, have been few, and a case based on a monopsony theory would be even harder to make if the arrangement resulted in lower prices being passed on to consumers. (Indeed, Comcast would point out that broadcasters have typically come out on top in retransmission fee disputes with cable companies, leading to higher consumer cable bills.) While a monopsony case is unlikely, the Justice Department could still decide to act based on more familiar grounds related to Section 2 of the Sherman Act, which addresses abuse of market power. And any such case would look not just at cable TV, but at larger internet issues.

Look at broadband, not cable

As my colleague Stacey Higginbotham explained yesterday, this deal isn’t really about cable, but about broadband services. It is not about a choice of cable company, but instead about deciding who will control the pipe of information that comes into our home alongside our gas and electricity. And it is on this front that the antitrust issues are most profound. If Comcast and Time Warner Cable merge, the combined company could control at least 40 percent of the country’s broadband market. Despite Comcast’s claims that there are plenty of broadband options, the reality is that most households only have one or two choices for high-speed internet. One problem, as Jon Brodkin of Ars Technica explained this week, is that it is extremely difficult for new internet service providers to get off the ground — in part because of legal threats they face from incumbents. 
At the same time, Comcast’s existing control over content giants like NBC, and recent disputes with the likes of Netflix over content delivery, have already raised the question of whether further consolidation of broadband services will harm consumers. The fate of the proposed merger, then, will turn on to what degree the senators focus on the pay TV issues — and to what degree they focus instead on the real issue of broadband delivery.
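One standard yardstick regulators use to quantify the kind of market concentration at issue here is the Herfindahl-Hirschman Index (HHI): the sum of the squared market shares of every firm in the market. A quick sketch, with made-up local broadband shares rather than real figures:

```python
def hhi(shares_pct):
    """Herfindahl-Hirschman Index: sum of squared market shares (in %).
    The 2010 DOJ/FTC Horizontal Merger Guidelines treat markets above
    2,500 as highly concentrated."""
    return sum(s * s for s in shares_pct)

# Illustrative, made-up shares: many households effectively face a
# duopoly (or less) for high-speed internet.
duopoly = hhi([60, 40])              # 5200: highly concentrated
four_way = hhi([25, 25, 25, 25])     # 2500: right at the threshold
print(duopoly, four_way)
```

The sketch illustrates why the horizontal case here is weak in pay TV (Comcast and Time Warner Cable do not overlap, so local shares barely move) while the broadband picture, where one or two firms dominate each locality, looks far more concentrated.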

Read More...
posted 11 days ago on gigaom
Q&A platform Quora has raised $80 million in Series C funding, the company announced Wednesday. The round was led by Tiger Global and included participation from Benchmark, Matrix, Northbridge and Peter Thiel. Mark Bodnick, Quora’s head of business operations, declined to confirm how much money the company has raised in total since it launched in 2010, but estimates based on reports at the time of those rounds put the total at $141 million. Bodnick said that the money will largely be banked away to keep the platform online indefinitely, and may also be used for international expansion. Right now, Quora can only be accessed in English, but the company will likely begin translating the site into other languages to boost accessibility. No particular plans or languages have been decided upon, but Bodnick said that it’s a high priority down the road — and one of the ways that Quora continues to model itself after Wikipedia. “We want to keep the service running no matter what happens in the future,” Bodnick said. Quora has been angling for the past few years to set up a monetization model, whether through ads or other means, but nothing has yet come to fruition and the company won’t confirm that any plans are in place. Since its launch in 2010, Quora has gathered content on more than 500,000 topics. It most recently introduced a verified profiles system, which it launched with President Barack Obama’s account.

Read More...
posted 11 days ago on gigaom
Smart homes, connected cars and personal health care devices may grab the bulk of the headlines, but every business can benefit from the internet of things (IoT). Whether you want to add intelligence to factories, increase the efficiency of your shipping, advertise to new demographics, or simply deliver enterprise applications to the personal devices of your employees and/or customers, the IoT should be central to your IT strategy. The sky is the limit, but getting from here to there is no simple task. Connecting end points to backends with the necessary speed, scale and security takes planning, focus and a clear understanding of the end goal. In this webinar, our panel will address these topics: How will the IoT affect businesses of various sorts? What are the benefits of greater device connectivity? Which businesses and industries have made best use of the IoT so far, and what traits do they share? What are the most common implementation challenges? What process and technology changes must businesses make to support IoT-based initiatives? Speakers include David S. Linthicum, SVP, Cloud Technology Partners; Craig Foster, freelance analyst, writer and consultant; Rich Morrow, founder and head geek, quicloud; and Jonas Jacobi, co-founder and president, Kaazing. Register here to join Gigaom Research and our sponsor KAAZING for “The internet of things: making it happen in your business,” a free analyst roundtable webinar on Tuesday, April 22, 2014, at 10:00 a.m. PT.

Read More...
posted 11 days ago on gigaom
One of the issues Amazon Web Services customers cite is how difficult it can be to track and project their cloud costs over time — there’s just so much stuff up there! Well, Amazon apparently has been paying attention — it just announced Cost Explorer, a tool that shows a customer’s cloud spending for the current month and “automatically pre-populates your last 4 months of AWS spend so you can visualize your AWS costs, and start analyzing trends and spending patterns,” according to an AWS blog post. One AWS partner said this is a tool AWS needs to offer. “Cost management continues to be one of the top pain points we hear from the AWS customers we speak to. Given billing is so core to the service AWS provides, I’ve been surprised AWS hasn’t been more aggressive here,” said Izzy Azeri, co-founder of Stackdriver, via email. “Based on the description to do granular filtering and event based cost analysis, I think this can be very compelling to customers. My guess is this is just the tip of the iceberg in terms of spend management and optimization tools for AWS.” Cost Explorer comes with some pre-set views — monthly spend broken out by service, monthly spend by linked account and daily spend. One AWS watcher said it was unclear how Cost Explorer differs from Amazon’s existing CloudWatch service, although the latter appears to track usage of various AWS resources but not the associated costs. I’ve reached out to AWS for clarification here. A cottage industry has grown up around helping AWS customers assess, map out and cut their AWS costs — third parties like Cloudability and Cloudyn play here — but Amazon keeps adding more user-friendly dashboards, including Trusted Advisor, which helps customers optimize their use of AWS services.
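Under the hood, a view like "monthly spend broken out by service" is just an aggregation over billing line items. Here is a local sketch of that roll-up; the input record format is invented for illustration and is not the AWS billing schema.

```python
from collections import defaultdict

def monthly_spend_by_service(line_items):
    """Roll raw billing line items up into a Cost Explorer-style view:
    total spend keyed by (month, service). Field names are assumptions."""
    totals = defaultdict(float)
    for item in line_items:
        totals[(item["month"], item["service"])] += item["cost"]
    return dict(totals)

items = [
    {"month": "2014-03", "service": "EC2", "cost": 120.0},
    {"month": "2014-03", "service": "S3", "cost": 14.5},
    {"month": "2014-03", "service": "EC2", "cost": 30.0},
    {"month": "2014-04", "service": "EC2", "cost": 95.0},
]
report = monthly_spend_by_service(items)
print(report[("2014-03", "EC2")])  # 150.0
```

Swapping the grouping key for linked account or day yields the other two pre-set views the post mentions, which is why tools like Cloudability and Cloudyn built businesses on slicing the same billing data more flexibly.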

Read More...