Datavail
Datavail Infotech - Quality Assurance Lead - ETL Testing
Job Location
Hyderabad, India
Job Description
Quality Assurance Lead | MCA/B.E/B.Tech/B.Sc/Any Graduation. Experience: 8 to 10 years. Location: Hyderabad.

Description:
- Minimum of 5 years of relevant experience in ETL testing.
- Proficient in writing SQL queries for RDBMS platforms.
- Good knowledge of data warehousing concepts (facts, dimensions) and of ETL full and incremental loads.
- Able to develop SQL queries to reconcile data between sources and targets (an illustrative sketch follows the description below).
- Sound understanding of Slowly Changing Dimensions, with the ability to test and reconcile historical and current data.
- Data Accuracy: Ensure the correctness of data loaded into Snowflake through Talend ETL processes, and perform validations to confirm data is accurate and meets business requirements.
- ETL Testing: Conduct thorough testing of Talend-based ETL processes that extract, transform, and load data from source systems to Snowflake, including validation of transformations, schemas, data quality rules, and any applied business logic.
- End-to-End Testing: Perform end-to-end testing to verify that data flows seamlessly from source to Snowflake without errors and in compliance with business rules.
- Data Reconciliation: Reconcile data between source systems and Snowflake to ensure data is correctly transferred and transformed without loss or discrepancies.
- Data Quality Rules: Define and implement data quality rules in Talend to ensure that data in Snowflake is clean, accurate, and consistent; this may involve removing duplicates, handling missing values, or correcting data formats.
- Automation of Data Quality Checks: Implement automated data quality checks in Talend to monitor data integrity and trigger alerts or notifications when issues arise (a sketch of such a check also follows the description below).
- Anomaly Detection: Monitor data loads and transformations to identify anomalies, errors, or discrepancies and raise flags for corrective action.
- Performance Optimization: Optimize Talend ETL jobs to improve processing time and efficiency, ensuring that data loads to Snowflake are fast and efficient.
- Query and ETL Performance Tuning: Work with data engineers and Talend developers to tune queries, SQL scripts, and ETL processes to minimize latency and optimize overall performance.
- Cost Efficiency: Use Snowflake's cost management tools and Talend performance practices to ensure data processing is efficient and does not incur unnecessary costs.
- Issue Identification: Identify data issues in Snowflake resulting from Talend ETL jobs or queries and troubleshoot their source, including data quality problems, performance bottlenecks, and data transformation errors.
- Root Cause Analysis: Perform root cause analysis on data quality issues and collaborate with developers, data engineers, and other team members to resolve underlying issues.
- Data Recovery: Handle cases where ETL processes fail and ensure that data is recovered correctly without impacting downstream systems or reports.
- Stakeholder Communication: Work closely with business analysts, data engineers, and developers to understand data requirements and ensure that both Talend and Snowflake meet business needs.
- Requirements Gathering: Gather data transformation and integration requirements from business users to design accurate ETL processes in Talend and ensure correct data modelling in Snowflake.
- Collaboration on Data Governance: Collaborate with data governance teams to ensure data security, privacy, and compliance requirements are met.
- Test Case Documentation: Document detailed test cases, test scripts, and test results related to Talend ETL processes and Snowflake data quality.
- Data Quality Reports: Generate reports summarizing data quality metrics, testing outcomes, and any data-related issues discovered during testing.
- Process Documentation: Maintain documentation on ETL workflows, Snowflake data structures, data transformation logic, and other key data pipeline elements.
- Process Optimization: Continuously improve data quality processes, automation scripts, and ETL jobs to make them more efficient and scalable.
- Tool Upgrades: Stay updated with new features and best practices in both Snowflake and Talend, incorporating improvements into existing workflows.
- Knowledge Sharing: Share insights and best practices with team members, helping to standardize processes for data quality management.
- Compliance with Standards: Ensure that Talend ETL processes and data stored in Snowflake comply with organizational data governance standards and regulatory requirements (e.g., GDPR, HIPAA).
- Audit and Review: Perform data audits and reviews to ensure data in Snowflake is trustworthy and secure.
- Data Lineage: Maintain data lineage within Talend and Snowflake to trace how data moves from source to destination and to ensure compliance and audit readiness.
- Monitoring Data Loads: Set up real-time monitoring for data pipelines in Talend and ensure Snowflake performance is optimized for large-scale data loads.
- Scalability: Work on scalability improvements to accommodate growing data volumes in both Talend ETL jobs and Snowflake warehouse storage and processing.
- Report Testing: Validate and test reports and dashboards built on Snowflake data to ensure that business intelligence tools provide accurate and actionable insights.
- End-User Validation: Ensure that data displayed in end-user reports and dashboards aligns with business needs and is backed by accurate data transformations from Talend to Snowflake.

Additional skills:
- Experience with Software Quality Assurance
- SQL
- Azure / AWS
- Database
- Performance
- Governance

(ref:hirist.tech)
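As a hedged illustration of the source-to-target reconciliation work described above, the sketch below compares a row count and a simple column checksum between a source table and its Snowflake target. The cursors, table name, and column name are assumptions for the example and not part of the posting; any DB-API-compatible drivers (for instance, a source RDBMS driver and the Snowflake Python connector) would be used in the same way.

```python
# Illustrative sketch only: reconcile a source table against its Snowflake target
# by comparing a row count and a column-level checksum. Connection setup is
# assumed to happen elsewhere; table and column names are hypothetical.

def reconcile(source_cur, target_cur, table, numeric_col):
    """Return a list of (check_name, source_value, target_value) mismatches."""
    checks = {
        "row_count": f"SELECT COUNT(*) FROM {table}",
        "column_sum": f"SELECT COALESCE(SUM({numeric_col}), 0) FROM {table}",
    }
    mismatches = []
    for name, sql in checks.items():
        source_cur.execute(sql)
        target_cur.execute(sql)
        src_val = source_cur.fetchone()[0]
        tgt_val = target_cur.fetchone()[0]
        if src_val != tgt_val:
            mismatches.append((name, src_val, tgt_val))
    return mismatches

# Usage (hypothetical identifiers):
#   issues = reconcile(src_cursor, snowflake_cursor, "SALES_FACT", "NET_AMOUNT")
#   assert not issues, f"Reconciliation failed: {issues}"
```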
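Similarly, the following sketch shows what an automated data quality check of the kind mentioned above might look like: two simple rules (duplicate keys and null keys) evaluated against a Snowflake table, with any violation triggering an alert. The rule set, table, and key column are illustrative assumptions; a real pipeline would route the failure to a notification or monitoring service rather than raising an exception.

```python
# Illustrative sketch only: minimal automated data quality rules run against a
# Snowflake table after a load. Each rule must evaluate to 0; anything else is
# treated as a data quality failure. Table and key names are hypothetical.

DQ_RULES = {
    "no_duplicate_keys": "SELECT COUNT(*) - COUNT(DISTINCT {key}) FROM {table}",
    "no_null_keys": "SELECT COUNT(*) FROM {table} WHERE {key} IS NULL",
}

def run_dq_checks(cursor, table, key):
    """Run each rule and raise an alert if any rule reports violations."""
    failures = {}
    for rule_name, template in DQ_RULES.items():
        cursor.execute(template.format(table=table, key=key))
        violations = cursor.fetchone()[0]
        if violations:
            failures[rule_name] = violations
    if failures:
        # Stand-in for an alert/notification hook in a real pipeline.
        raise ValueError(f"Data quality checks failed for {table}: {failures}")

# Usage (hypothetical identifiers):
#   run_dq_checks(snowflake_cursor, "CUSTOMER_DIM", "CUSTOMER_ID")
```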
Location: Hyderabad, IN
Posted Date: 5/9/2025
Contact Information
Contact: Human Resources, Datavail