China is a country of extremes, with well-developed, industrialized cities flourishing while rugged, remote regions continue to struggle.
One of the remotest and historically poorest provinces in Southwest China—Guizhou—has come a particularly long way in a short time and is well on its way to becoming a hub for China’s push into big data. What resembled suburbia a decade ago has been converted into a new urban district complete with skyscrapers, a convention center, and data centers.
High-speed railways, bridges, tunnels, and added international flights linking it to domestic and foreign cities have lifted the province from isolation and connected it with the world.
Ranked 25th out of 31 Chinese provinces economically, Guizhou has hosted the country's four-day International Big Data Expo three years in a row, and by the time the 2017 event concluded at the end of May, exhibiting companies had inked contracts worth $2.4 billion, according to a report by NPR.
Many technology behemoths made the trek to the Far East for the event: Apple, Facebook, Microsoft, Google, Amazon, Intel, IBM, and Dell were all there. Silicon Valley figures attended as well, according to the Expo website, among them Stanford AI and ethics professor Jerry Kaplan, start-up entrepreneur and Founders Space creator Adelyn Zhou, and Steve Hoffman, regional lead of developer relations at Google.
With an average year-round temperature of 59 degrees Fahrenheit, Guizhou is well-suited for data centers, and the central government has done an admirable job of attracting firms with pilot programs and discounts on hydroelectricity.
Taiwanese electronics company Foxconn, which manufactures servers in addition to iPhones, Kindles, PlayStations, and other gadgets, has a factory and a 6,000-server Green Tunnel data center located an hour's drive from the provincial capital.
Like many companies in China, Foxconn is trying to make its manufacturing operations more efficient through the use of cloud computing, networked machines, and, eventually, artificial intelligence. All of this requires storing and analyzing huge amounts of data.
All copyrights for this article are reserved to datacenterknowledge.com