Skills and Qualifications
- At least 5 years of experience in ETL development, primarily with DataStage
- Proficient in developing Extraction, Transformation and Loading (ETL) strategies.
- Expert in designing Parallel jobs using various stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Data Set, Lookup File Set, Complex Flat File, Modify, Aggregator, and XML.
- Expert in designing Server jobs using various stages such as Sequential File, ODBC, Hashed File, Aggregator, Transformer, Sort, Link Partitioner, and Link Collector.
- Experienced in integrating various data sources (DB2 UDB, SQL Server, Oracle PL/SQL, Teradata, XML, and MS Access) into a data staging area.
- Expert in working with DataStage Manager, Designer, Administrator, and Director.
- Experienced in analyzing data generated by business processes, defining granularity, performing source-to-target mapping of data elements, and creating indexes and aggregate tables for data warehouse design and development.
- Expert in Big Data concepts and platforms such as Hadoop, Hive, Beeline, and EDL.
- Experienced in Agile development and associated tools such as Jira, Confluence, Bitbucket, and Jenkins.
- Excellent knowledge of analyzing data dependencies using metadata stored in the repository, and of preparing batches for existing sessions to facilitate scheduling of multiple sessions.
- Proven track record of troubleshooting DataStage jobs and addressing production issues such as performance tuning and enhancements.
- Experienced in database programming for data warehouse schemas; proficient in dimensional modeling.
- Expert in writing UNIX shell scripts (Korn shell) to automate processes and schedule DataStage jobs.
- Experienced in data modeling and reverse engineering using tools such as Erwin and MS Visio.
Degrees or certifications
- Bachelor's degree in a technical field such as computer science, computer engineering, or a related field required.