Having problems with ingesting and processing Big Data? Let BDP do the tough job for you!
BDP is a service that assists you with the complete cycle of Big Data ingestion. It extracts data from various OLTP systems, processes it, and finally ingests it into the Hadoop Distributed File System (HDFS). It consists of four modules that perform the ingestion and processing of data:
- Extractor: Finds input files of various types and formats.
- MData builder: Builds and attaches metadata for the files being processed.
- Packager: Re-packages input files into Avro files and splits larger files into configurable sizes.
- Transporter: Moves the processed files to HDFS for further processing.
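The four modules above form a pipeline: discover files, attach metadata, bundle with a size cap, then move to the destination. As a rough illustration only (these function names and signatures are hypothetical, not BDP's actual API, and plain JSON bundles stand in for Avro), the flow could be sketched like this:

```python
import json
import os

# Purely illustrative sketch of the four-stage BDP flow; all names here
# are assumptions for the example, not BDP's real interface.

def extract(input_dir):
    """Extractor: discover input files of various types and formats."""
    return [os.path.join(input_dir, name) for name in sorted(os.listdir(input_dir))]

def build_metadata(path):
    """MData builder: build metadata for a file being processed."""
    return {"path": path, "size": os.path.getsize(path)}

def package(records, max_bytes=64 * 1024 * 1024):
    """Packager: group records into bundles capped at max_bytes
    (stands in for re-packaging into Avro and splitting large inputs)."""
    bundles, current, current_size = [], [], 0
    for rec in records:
        # Start a new bundle when the next record would exceed the cap.
        if current and current_size + rec["size"] > max_bytes:
            bundles.append(current)
            current, current_size = [], 0
        current.append(rec)
        current_size += rec["size"]
    if current:
        bundles.append(current)
    return bundles

def transport(bundles, dest_dir):
    """Transporter: write each bundle to the destination (HDFS in BDP)."""
    written = []
    for i, bundle in enumerate(bundles):
        out_path = os.path.join(dest_dir, "bundle-%d.json" % i)
        with open(out_path, "w") as fh:
            json.dump(bundle, fh)
        written.append(out_path)
    return written
```

The size cap in `package` is what makes large inputs split into multiple configurable-size bundles instead of one oversized file.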
BDP is designed as a service that can run multiple application instances on the same host, giving you the flexibility to process multiple groups of streams independently and preventing backlogs. It comes with advanced features such as Auto Restore and Quarantine during file transfers, and supports Kerberos for strong security. Run into trouble? The extensive logs and troubleshooting guide have you covered. Your job made easy!
Contact Us for a Demo.