Artificial Intelligence | News | Insights | AiThority

Google Cloud Unveils New Features to Bigtable; Adds Autoscaling to Improve Overall Manageability

Data-driven organizations spend around 3.5 percent of their revenue on data management, largely on optimizing resources across data management, DevOps, and automation. Keeping full control over the cost and manageability of that data is becoming increasingly difficult. To improve overall IT manageability and optimize cost, Google Cloud has announced new features for its Bigtable platform. Bigtable is a fully managed, scalable NoSQL database service for large operational and analytical workloads. It is currently in production at some of the world's biggest brands, such as The Home Depot, Equifax, and Twitter.

The latest expansion allows Bigtable nodes to store up to 5 TB per node (up from 2.5 TB) on SSD and 16 TB per node (up from 8 TB) on HDD. This is especially cost-effective for batch workloads that operate on large amounts of data.
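As a rough, back-of-the-envelope illustration (not from the article), doubling the per-node storage limit roughly halves the node count a purely storage-bound workload needs:

```python
import math

# Per-node storage limits (TB) cited in the announcement.
NEW_LIMITS_TB = {"ssd": 5.0, "hdd": 16.0}   # current limits
OLD_LIMITS_TB = {"ssd": 2.5, "hdd": 8.0}    # previous limits

def min_nodes(data_tb: float, storage: str, limits=NEW_LIMITS_TB) -> int:
    """Smallest node count whose combined storage limit covers data_tb."""
    return max(1, math.ceil(data_tb / limits[storage]))

# A 100 TB SSD dataset: 40 nodes under the old limit, 20 under the new one.
print(min_nodes(100, "ssd", OLD_LIMITS_TB))  # 40
print(min_nodes(100, "ssd"))                 # 20
```

This only models the storage constraint; real clusters are also sized for CPU and throughput, so treat it as an upper bound on the savings.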

Emerging technologies in Artificial Intelligence, Machine Learning (AI ML), and Big Data analytics are enabling data-driven organizations to understand and manage the finer nuances of IT and DevOps workflows. To bring more fluidity to data management at the enterprise level, Google Cloud has announced the general availability of autoscaling for Bigtable, which automatically adds or removes capacity in response to changing demand from your applications.
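To make the idea concrete, here is a toy sketch of a CPU-target scaling rule of the kind such a feature implies: pick the node count that brings per-node utilization back toward a target, clamped to configured bounds. This is an illustrative model, not Google's actual algorithm, and the parameter names are ours:

```python
import math

def scale_nodes(current_nodes: int, cpu_utilization: float,
                cpu_target: float = 0.6,
                min_nodes: int = 1, max_nodes: int = 10) -> int:
    """Node count that would bring per-node CPU near cpu_target.

    Total work is roughly current_nodes * cpu_utilization; dividing by the
    target per-node utilization gives the desired cluster size.
    """
    desired = math.ceil(current_nodes * cpu_utilization / cpu_target)
    return max(min_nodes, min(max_nodes, desired))

print(scale_nodes(4, 0.9))   # 6 -> scale out under load
print(scale_nodes(6, 0.3))   # 3 -> scale in when demand drops
```

The min/max bounds matter in practice: they cap cost during traffic spikes and keep a floor of capacity for latency-sensitive serving.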

Why autoscaling? 


Autoscaling is part of an infrastructure management suite that lets customers pay only for the services and resources they require. IT teams can hand off the task of managing infrastructure to platforms such as Google Cloud Bigtable, freeing up time to focus on other important business operations.

The new Bigtable capabilities not only reduce cost and management overhead but also double the per-node storage limit, letting you store more data for less. This is particularly valuable for storage-optimized workloads, especially in environments that use cluster groups. Cluster groups provide flexibility in how you route application traffic to ensure a great experience for your customers. In addition, Cloud Bigtable users gain access to more granular utilization metrics for better observability, faster troubleshooting, and workload management.
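Traffic routing in Bigtable is configured through app profiles. As a sketch of what routing to a subset of clusters looks like with the `gcloud` CLI (the instance, profile, and cluster names here are placeholders, and the exact flags should be checked against the current `gcloud bigtable app-profiles` reference):

```shell
# Route an application's traffic to any cluster in a restricted group
# (multi-cluster routing limited to two named clusters).
gcloud bigtable app-profiles create batch-profile \
    --instance=my-instance \
    --description="Batch analytics traffic" \
    --route-any \
    --restrict-to=cluster-us-east,cluster-us-west

# Or pin latency-sensitive serving traffic to a single cluster.
gcloud bigtable app-profiles create serving-profile \
    --instance=my-instance \
    --description="Low-latency serving" \
    --route-to=cluster-us-east
```

Separating profiles this way keeps heavy batch reads from competing with user-facing requests on the same cluster.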

Cloud Bigtable is particularly useful in data-intensive workflows such as hyper-personalization, online retail, blockchain and Fintech, programmatic adtech, sports analytics, mobile gaming, and IoT. Bigtable also underpins ML workloads that deliver better predictions, and it gives IT and DevOps teams easy integration with the rest of the GCP services, such as BigQuery, as well as the Apache HBase ecosystem.

[To share your insights with us, please write to sghosh@martechseries.com]
