Santa Monica, CA (PRWEB) November 20, 2012
As a cloud telephony pioneer and SMS gateway, CallFire is responsible for sending billions of calls and text messages. That is a billion with a B. In response to Hurricane Sandy alone, CallFire sent 1.9 million emergency broadcast notifications. The company hosts an enormous amount of data. As such, the CallFire telephony infrastructure has many moving components and database designs built to handle big data.
Here are some metrics to illustrate the data challenge the company faces:
1.4 billion calls and text messages
Over 50,000 accounts
Over 6 million campaigns
80 million sound files
14 TB in network file storage
MySQL: over 10,000 queries per second (qps) at peak
With as much data as the CallFire system was processing, the company outgrew its data center. CallFire's CTO Vijesh Mehta recently outlined how CallFire remedied its big data challenge in a presentation to the LA CTO Forum.
In his presentation, Vijesh discusses how CallFire created a solution that works across data centers, scales to meet increases in demand, is fault tolerant, and can handle over 80 million sound files. For any techies interested in learning how to architect data-intensive applications, the lecture will surely provide some great insights. To view the presentation slides and learn more about how CallFire remedied its data challenge, click here.
About CallFire: CallFire (callfire.com) simplifies telephony, making sophisticated, expensive carrier-class telecom capabilities available through an affordable, easy-to-use GUI and API platform. Any business, from start-up to enterprise, can reach its customers on any device, using text messaging or voice, with CallFire's massively scalable, cloud telecom platform. CallFire products include Voice APIs, Business Text Messaging, Voice Broadcast, Local Phone Numbers, 800 Numbers, IVR, Call Tracking, Power Dialing for agents and more. Call analytics enable CallFire's 50,000 users to reach customers more often using text marketing, virtual numbers, auto dialers and mobile messaging.