ETL Developer – Talend
Job Duties:
- Design and develop ETL jobs that follow standards and best practices and are maintainable, modular, and reusable.
- Analyze and review complex object and data models and the metadata repository to structure processes and data for better management and more efficient access.
- Work on multiple projects and delegate work to junior analysts to deliver projects on time.
- Train and mentor junior analysts, building their proficiency in the ETL process.
- Prepare mapping documents for extracting, transforming, and loading data, ensuring compatibility with all tables and requirement specifications.
- Experience in ETL system design and development with Talend / Pentaho PDI is essential.
- Create data quality rules in Talend.
- Tune Talend / Pentaho jobs for performance optimization.
- Write relational (SQL) and multidimensional (MDX) database queries.
- Functional knowledge of Talend Administration Center / Pentaho Data Integration, job servers and load-balancing setup, and all related administrative functions.
- Develop, maintain, and enhance unit test suites to verify the accuracy of ETL processes, dimensional data, OLAP cubes, and various forms of BI content including reports, dashboards, and analytical models.
- Exposure to the MapReduce components of Talend / Pentaho PDI.
- Comprehensive understanding and working knowledge of data warehouse loading, tuning, and maintenance.
- Working knowledge of relational database theory and dimensional database models.
- Creating and deploying Talend / Pentaho custom components is an added advantage.
- Java knowledge is nice to have.
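The distinction between relational (SQL) and multidimensional (MDX) queries in the duties above can be sketched as follows; the `sales` table, cube, and member names are hypothetical, used only for illustration:

```python
import sqlite3

# Hypothetical sales data, used only to illustrate the query styles.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, year INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [("North", 2023, 100.0), ("North", 2024, 150.0),
                  ("South", 2024, 80.0)])

# Relational (SQL): aggregate 2024 sales per region from a flat table.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales "
    "WHERE year = 2024 GROUP BY region ORDER BY region").fetchall()
print(rows)  # [('North', 150.0), ('South', 80.0)]

# A multidimensional (MDX) query against an OLAP cube addresses the same
# question via dimensions and measures rather than rows, along the lines of:
#   SELECT [Measures].[Amount] ON COLUMNS,
#          [Region].Members ON ROWS
#   FROM [SalesCube] WHERE [Time].[2024]
```

The SQL half runs as-is against SQLite; the MDX half is shown as a comment since it requires an OLAP server and cube.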
Skills and Qualifications:
- BE / B.Tech / MS degree in Computer Science, Engineering, or a related subject.
- 3+ years of experience.
- Proficiency with Talend or Pentaho Data Integration / Kettle.
- Ability to work independently.
- Ability to lead a team.
- Good written and oral communication skills.
About Helical IT Solutions
Overview:
The Senior Operations Executive plays a pivotal role in overseeing the daily operations of the organization. He/she is responsible for ensuring smooth functioning across various departments, optimizing processes, and implementing strategies to enhance efficiency and productivity. This position will report directly to the Operations Manager and collaborate with department heads to achieve organizational goals.
Key Responsibilities:
Strategic Planning:
- Develop and implement operational strategies aligned with the organization's objectives.
- Identify opportunities for process improvement and cost optimization.
Process Optimization:
- Analyse existing operational processes and identify opportunities for improvement.
- Develop and implement strategies to streamline workflows and enhance efficiency.
- Establish and maintain standard operating procedures (SOPs).
Cross-Functional Collaboration:
- Collaborate with various departments to understand their support needs.
- Address any operational issues that may arise and work towards resolution.
Technology Integration:
- Identify and evaluate new technologies or tools that can improve operational processes.
- Oversee the implementation of new systems and ensure proper training for the team.
Budget Management:
- Assist in the development and management of the operations support budget.
- Monitor expenditures and implement cost-saving measures where possible.
Managing Operational Teams:
- Manage functional teams such as technical operations, legal, and business operations, and review daily/monthly planning and goals.
Vendor Management:
- Identify and evaluate potential vendors for operational support services.
- Negotiate contracts and agreements to ensure favourable terms for the organization.
- Collaborate with vendors to resolve issues and improve service delivery.
Qualifications:
- Must have:
- Engineering degree (Software/IT/Computer) with 2+ years of work experience, plus an MBA (Operations/Business Management) with 3+ years of experience.
- Certification in project management or operations management (e.g., PMP, Six Sigma).
- Proficiency in MS Office tools, Atlassian products (JIRA, Confluence), and Miro (or any similar tools).
- Proven experience in operations management or a similar role.
- Strong leadership and team management skills.
- Excellent analytical and problem-solving abilities.
- Proficiency in project management tools and software.
- Sound knowledge of industry regulations and compliance standards.
Additional Requirements:
- Flexibility to adapt to changing priorities and work in a fast-paced environment.
- Ability to make sound decisions under pressure.
- Commitment to maintaining confidentiality and integrity.
- Willingness to travel occasionally as and when required.
Benefits and perks:
- Progressive leave policy for effective work-life balance.
- Training and Certification budget for your professional growth and development.
- Company-sponsored workcation once a year.
- Multicultural peer groups and supportive workplace policies.
- Celebrate monthly team events and fun-filled outings.
Our client is a leading nature-based personal care products company backed by a Private Equity firm. It manufactures and markets Ayurveda/nature-based hair and skincare products under a well-known brand known for its commitment to marrying the principles of Ayurveda with the best of natural ingredients to deliver outstanding products to customers across the globe.
Must be a self-starter with a strong orientation to build a high growth brand.
Requirements
1) Translate customer insights into concepts, personas, user journeys, storyboards, system maps, user flows, wireframes, visual design comps and prototypes with Adobe CS, Figma, and other tools
2) Create and ensure technical feasibility of UI/UX Design with active collaboration with Engineering and Product teams in iterative product development cycles
3) Manage and support the entire design process from concept preparation to final artwork as an individual contributor
4) Communicate the product's vision and articulate design decisions and trade-offs both verbally and visually
5) Incorporate industry standards, trends, user research and usability testing to iterate on product proposals
6) Define and execute product branding elements in collaboration with the design and marketing teams, including in-product deliverables as well as product-related marketing materials for platforms such as Facebook and Instagram.
Skills Required:
1) Bachelor's Degree or equivalent in Computer Science, Design, Multimedia, Human-Computer Interaction, Graphic Design or equivalent
2) High attention to detail and a demonstrated flair for iterative design, information architecture, and creative visualization.
3) Proficiency in popular design software such as Adobe XD, Figma, Photoshop, Illustrator, Affinity Designer, or similar mainstream tools.
4) Ability to maximize the design potential of any medium - Instagram Posts, YouTube teasers, infographics, Word and PowerPoint templates
5) Flair for articulating ideas pictorially and verbally
- Work individually or as part of a team on data science projects, collaborating closely with lines of business to understand business problems and translate them into well-defined machine learning problems that can be delivered as technical solutions.
- Build quick prototypes to check feasibility and value to the business.
- Design, train, and deploy neural networks for computer vision and machine learning-related problems.
- Perform various complex activities related to statistical/machine learning.
- Coordinate with business teams to provide analytical support for developing, evaluating, implementing, monitoring, and executing models.
- Collaborate with technology teams to deploy the models to production.
Key Criteria:
- 2+ years of experience in solving complex business problems using machine learning.
- Understanding and modeling experience in supervised, unsupervised, and deep learning models; hands-on knowledge of data wrangling, data cleaning/ preparation, dimensionality reduction is required.
- Experience in Computer Vision/Image Processing/Pattern Recognition, Machine Learning, Deep Learning, or Artificial Intelligence.
- Understanding of deep learning architectures such as Inception, VGG, FaceNet, YOLO, SSD, R-CNN, Mask R-CNN, and ResNet.
- Experience with one or more deep learning frameworks e.g., TensorFlow, PyTorch.
- Knowledge of vector algebra, statistical and probabilistic modeling is desirable.
- Proficiency in programming skills involving Python, C/C++, and Python Data Science Stack (NumPy, SciPy, Pandas, Scikit-learn, Jupyter, IPython).
- Experience working with Amazon SageMaker or Azure ML Studio for deployments is a plus.
- Experience with data visualization software such as Tableau, ELK, etc. is a plus.
- Strong analytical, critical thinking, and problem-solving skills.
- B.E. / B.Tech. / M.E. / M.Tech. in Computer Science, Applied Mathematics, Statistics, Data Science, or a related engineering field.
- Minimum 60% in Graduation or Post-Graduation
- Great interpersonal and communication skills
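The dimensionality-reduction expectation above can be sketched with a minimal PCA via SVD on the Python data science stack; the dataset here is a toy, and all names are illustrative:

```python
import numpy as np

# Tiny illustrative dataset: 6 samples, 3 features, two of them correlated.
rng = np.random.default_rng(0)
base = rng.normal(size=(6, 1))
X = np.hstack([base,
               2 * base + 0.01 * rng.normal(size=(6, 1)),
               rng.normal(size=(6, 1))])

# PCA via SVD of the centered data: project onto the top-2 principal components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X_reduced = Xc @ Vt[:2].T

print(X_reduced.shape)  # (6, 2)
```

In practice the same projection is usually done with `sklearn.decomposition.PCA`; the SVD form is shown to keep the sketch dependency-light.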
Be Part Of Building The Future
Dremio is the Data Lake Engine company. Our mission is to reshape the world of analytics to deliver on the promise of data with a fundamentally new architecture, purpose-built for the exploding trend towards cloud data lake storage such as AWS S3 and Microsoft ADLS. We dramatically reduce and even eliminate the need for the complex and expensive workarounds that have been in use for decades, such as data warehouses (whether on-premise or cloud-native), structural data prep, ETL, cubes, and extracts. We do this by enabling lightning-fast queries directly against data lake storage, combined with full self-service for data users and full governance and control for IT. The results for enterprises are extremely compelling: 100X faster time to insight; 10X greater efficiency; zero data copies; and game-changing simplicity. And equally compelling is the market opportunity for Dremio, as we are well on our way to disrupting a $25BN+ market.
About the Role
The Dremio India team owns the DataLake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for innovative minds with experience in leading and building high-quality distributed systems at massive scale and solving complex problems.
Responsibilities & ownership
- Lead, build, deliver and ensure customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product.
- Work on distributed systems for data processing with efficient protocols and communication, locking and consensus, schedulers, resource management, low latency access to distributed storage, auto scaling, and self healing.
- Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
- Lead the team to solve complex and unknown problems.
- Solve technical problems and customer issues with technical expertise.
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure.
- Mentor other team members for high quality and design.
- Collaborate with Product Management to deliver on customer requirements and innovation.
- Collaborate with Support and field teams to ensure that customers are successful with Dremio.
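The concurrency reasoning called out in the responsibilities above can be sketched in miniature (Python here for brevity, though the role itself targets Java/C++); the counter and worker are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor
import threading

# A shared counter updated by parallel workers: without the lock, the
# read-modify-write below would race and occasionally lose increments.
counter = 0
lock = threading.Lock()

def work(_):
    global counter
    with lock:  # serialize the read-modify-write critical section
        counter += 1

with ThreadPoolExecutor(max_workers=8) as pool:
    list(pool.map(work, range(1000)))

print(counter)  # 1000
```

The same pattern, generalized to locks, consensus, and schedulers across machines rather than threads, is the kind of reasoning the distributed-systems work described here involves.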
Requirements
- B.S. / M.S. / equivalent in Computer Science or a related technical field, or equivalent experience
- Fluency in Java/C++ with 8+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models, and their use in developing distributed and scalable systems
- 5+ years experience in developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Ability to solve ambiguous, unexplored, and cross-team problems effectively
- Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and Google Cloud)
- Understanding of distributed file systems such as S3, ADLS, or HDFS
- Excellent communication skills and affinity for collaboration and teamwork
- Ability to work individually and collaboratively with other team members
- Ability to scope and plan solutions for big problems and mentor others on the same
- Interested and motivated to be part of a fast-moving startup with a fun and accomplished team