


• Work with various stakeholders, understand requirements, and build solutions/data pipelines
that address the needs at scale
• Bring key workloads to the clients’ Snowflake environment using scalable, reusable data
ingestion and processing frameworks to transform a variety of datasets
• Apply best practices for Snowflake architecture, ELT and data models
Skills (at least 50% of the following):
• A passion for all things data; understanding how to work with it at scale, and more importantly,
knowing how to get the most out of it
• Good understanding of native Snowflake capabilities such as data ingestion, data sharing, zero-copy cloning, tasks, Snowpipe, etc. (see the sketch after this list)
• Expertise in data modeling, with a good understanding of modeling approaches like Star
schema and/or Data Vault
• Experience in automating deployments
• Experience writing code in Python, Scala, Java, or PHP
• Experience in ETL/ELT, either via a code-first approach or using low-code tools such as AWS Glue, AppFlow, Informatica, Talend, Matillion, Fivetran, etc.
• Experience with one or more AWS services, especially in relation to integration with Snowflake
• Familiarity with data visualization tools such as Tableau, Power BI, Domo, or similar
• Experience with data virtualization tools such as Trino, Starburst, Denodo, Data Virtuality, Dremio, etc.
• SnowPro Advanced: Data Engineer certification is a must.
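
A minimal sketch of the kind of Snowflake work described above, using the snowflake-connector-python client to run a zero-copy clone, a Snowpipe definition, and a scheduled task; the account, warehouse, stage, and table names are hypothetical placeholders, not part of any actual client environment.

```python
# Minimal Snowflake ingestion/ELT sketch via snowflake-connector-python.
# All object names (ETL_WH, RAW_DB, MY_STAGE, RAW_EVENTS, ...) are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="RAW_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Zero-copy clone: an instant, storage-free copy for development or testing.
cur.execute("CREATE OR REPLACE TABLE RAW_EVENTS_DEV CLONE RAW_EVENTS")

# Snowpipe: continuously load staged JSON files into the raw table.
cur.execute("""
    CREATE OR REPLACE PIPE RAW_EVENTS_PIPE AS
    COPY INTO RAW_EVENTS
    FROM @MY_STAGE
    FILE_FORMAT = (TYPE = 'JSON')
""")

# Task: an hourly ELT step that pushes fresh rows into a curated table.
cur.execute("""
    CREATE OR REPLACE TASK LOAD_CURATED_EVENTS
    WAREHOUSE = ETL_WH
    SCHEDULE = '60 MINUTE'
    AS
    INSERT INTO CURATED.EVENTS
    SELECT * FROM RAW_EVENTS
    WHERE LOADED_AT > DATEADD('hour', -1, CURRENT_TIMESTAMP())
""")
cur.execute("ALTER TASK LOAD_CURATED_EVENTS RESUME")

cur.close()
conn.close()
```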

Similar jobs
We are seeking a strong ASP.NET MVC / C# Developer to join our team and work directly with our UK-based client. The role requires not just technical depth but also excellent communication skills, independence, and leadership qualities. You will be working closely with the client on a near-daily basis, understanding requirements, proposing solutions, and delivering high-quality software.
Key Responsibilities
● Develop, maintain, and enhance applications built on ASP.NET MVC, C#, and SQL Server.
● Work on specific UI workflows that use ReactFlow (ReactJS).
● Collaborate with UK clients daily to gather requirements and provide updates.
● Write clean, scalable, and maintainable code with strong attention to quality.
● Take ownership of features, work independently, and deliver within deadlines.
● Gradually take a leadership role in guiding best practices and mentoring peers.
Required Skills
● C# / ASP.NET MVC: Strong hands-on experience.
● SQL Server: Solid understanding of relational databases and writing optimized queries.
● Communication: Strong English speaking and writing skills, confidence in client-facing interactions.
● Independence: Ability to work without continuous hand-holding.
● Personality: Outgoing, proactive, and able to naturally emerge as a leader.
Nice to Have
● ReactJS experience, especially with ReactFlow.
● Exposure to Agile/Scrum practices.
We are looking for a US IT Recruiter / Technical Recruiter / Talent Acquisition specialist for our organisation, GeniQom Technologies, located in City Centre, Siliguri. Candidates looking for a long-term stint would be preferred. The ability to multitask and prioritize at any given time would definitely grab our attention.
Responsibilities :
Responsible for handling IT and non-IT requirements from US-based direct clients.
Sourcing from various job portals and social networking sites such as Monster, Indeed, LinkedIn, etc.
Responsible for the strategy development process, which includes understanding client requirements and mapping the relevant targets.
Conduct initial screenings and reference checks, negotiate pay rates and relocation, coordinate client interviews, and work with the Account Managers to close the position.
Responsible for achieving a good conversion ratio of submittals into interviews and placements.
Qualification & Skills
Must have basic computer knowledge, including MS Office (Word, Excel, PowerPoint).
Educational qualification: minimum graduate or undergraduate (or equivalent) in any discipline.
Candidates with a BPO or tele-calling background are preferred; US shift preferred.
Good communication skills.
Fluent English is a must.
Timing: 6:30 PM to 3:30 AM IST
Shift: Night (US shift)
Working days: Monday to Friday.
Salary: Negotiable (+Incentives)

Senior Data Engineer
Responsibilities:
● Clean, prepare and optimize data at scale for ingestion and consumption by machine learning models
● Drive the implementation of new data management projects and the restructuring of the current data architecture
● Implement complex automated workflows and routines using workflow scheduling tools (see the Airflow sketch after this list)
● Build continuous integration, test-driven development and production deployment frameworks
● Drive collaborative reviews of design, code, test plans and dataset implementation performed by other data engineers in support of maintaining data engineering standards
● Anticipate, identify and solve issues concerning data management to improve data quality
● Design and build reusable components, frameworks and libraries at scale to support machine learning products
● Design and implement product features in collaboration with business and technology stakeholders
● Analyze and profile data for the purpose of designing scalable solutions
● Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational issues
● Mentor and develop other data engineers in adopting best practices
● Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders
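
As a concrete illustration of the workflow-scheduling responsibility above, here is a minimal Airflow DAG sketch (assuming Airflow 2.4+); the DAG id, task names, and callables are hypothetical placeholders rather than an actual pipeline.

```python
# Minimal Airflow 2.4+ DAG sketch; dag_id, task names, and callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull raw data from a source system (placeholder logic).
    print("extracting")


def transform():
    # Clean and reshape the extracted data (placeholder logic).
    print("transforming")


def load():
    # Write the curated output to the warehouse (placeholder logic).
    print("loading")


with DAG(
    dag_id="daily_events_pipeline",
    schedule="@daily",              # run once per day
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```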
Qualifications:
● 8+ years of experience developing scalable Big Data applications or solutions on distributed platforms
● Experience with Google Cloud Platform (GCP); experience with other cloud platforms is good to have
● Experience working with data warehousing tools, including DynamoDB, SQL, and Snowflake
● Experience architecting data products on streaming, serverless, and microservices architectures and platforms
● Experience with Spark (Scala/Python/Java) and Kafka (see the sketch after this list)
● Work experience with Databricks (Data Engineering and Delta Lake components)
● Experience working with Big Data platforms, including Dataproc, Databricks, etc.
● Experience working with distributed technology tools including Spark, Presto, Databricks, Airflow
● Working knowledge of data warehousing and data modeling
● Experience working in Agile and Scrum development processes
● Bachelor's degree in Computer Science, Information Systems, Business, or other relevant subject area
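
To make the Spark/Kafka requirement concrete, below is a minimal PySpark Structured Streaming sketch that reads JSON events from a Kafka topic and lands them as Parquet; the broker address, topic, schema, and paths are hypothetical, and it assumes the spark-sql-kafka connector package is available on the Spark classpath.

```python
# Minimal PySpark Structured Streaming sketch: Kafka -> Parquet.
# Broker, topic, schema, and paths are hypothetical; requires the
# spark-sql-kafka-0-10 connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka_events_ingest").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                      # hypothetical topic
    .load()
)

# Kafka delivers the payload as bytes; cast to string and parse the JSON body.
parsed = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
       .select("e.*")
)

query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/data/curated/events")             # hypothetical output path
    .option("checkpointLocation", "/data/checkpoints/events")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```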
Role: Senior Data Engineer
Total No. of Years: 8+ years of relevant experience
To be onboarded by: Immediate
Notice Period:
Skills (Mandatory / Desirable, with minimum and maximum years of project experience):
● GCP exposure: Mandatory, 3 to 7 years
● BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, Spark and PySpark: Mandatory, 5 to 9 years
● Relational SQL: Mandatory, 4 to 8 years
● Shell scripting language: Mandatory, 4 to 8 years
● Python/Scala language: Mandatory, 4 to 8 years
● Airflow/Kubeflow workflow scheduling tool: Mandatory, 3 to 7 years
● Kubernetes: Desirable, 1 to 6 years
● Scala: Mandatory, 2 to 6 years
● Databricks: Desirable, 1 to 6 years
● Google Cloud Functions: Mandatory, 2 to 6 years
● GitHub source control tool: Mandatory, 4 to 8 years
● Machine Learning: Desirable, 1 to 6 years
● Deep Learning: Desirable, 1 to 6 years
● Data structures and algorithms: Mandatory, 4 to 8 years

- Must have 3+ years’ experience in ASP.NET with C# and VB.NET
- Must have at least 2 years’ experience in SQL Server
- Must have experience working with SOAP and REST Web Services
- Must have experience with SSRS and SSIS
- Must have some experience with the MVC framework and AngularJS
- Must be able to work with basic CSS and HTML
- Must be able to work with jQuery and JavaScript
- Experience with Reporting Services, WCF, etc. would be a strong positive
Basic understanding of SEO and of Search, Display, and Remarketing campaigns.
Should be ready to learn new things and take on challenges.
Should have a professional approach towards work.
Basic knowledge of:
Creating backlinks
Competitor backlink analysis
Local business listings
Bookmarking
Profile submission
Link building through guest blogging
Submitting images on relevant image submission websites
Analytics/Webmaster tools
Classified ad submission
Q&A on Quora
Web 2.0 submission
PPT creation and submission


Work Location - Bangalore
The Data Analytics Senior Analyst is a seasoned professional role. Applies in-depth disciplinary knowledge, contributing to the development of new techniques and the improvement of processes and workflow for the area or function. Integrates subject matter and industry expertise within a defined area. Requires in-depth understanding of how areas collectively integrate within the sub-function as well as coordinate and contribute to the objectives of the function and overall business. Evaluates moderately complex and variable issues with substantial potential impact, where development of an approach/taking of an action involves weighing various alternatives and balancing potentially conflicting situations using multiple sources of information. Requires good analytical skills in order to filter, prioritize and validate potentially complex and dynamic material from multiple sources. Strong communication and diplomacy skills are required. Regularly assumes an informal/formal leadership role within teams. Involved in coaching and training of new recruits. Significant impact in terms of project size, geography, etc. by influencing decisions through advice, counsel and/or facilitating services to others in area of specialization. Work and performance of all teams in the area are directly affected by the performance of the individual.
Responsibilities:
- Build and enhance the software stack for modelling and data analytics
- Incorporate relevant data related algorithms in the products to solve business problems and improve them over time
- Automate repetitive data modelling and analytics tasks
- Keep up to date with available relevant technologies, to solve business problems
- Become a subject matter expert and work closely with analytics users to understand their needs and provide recommendations/solutions
- Help define/share best practices for the business users and enforce/monitor that best practices are being followed for better efficiency (speed to market and system performance)
- Share daily/weekly progress made by the team
- Work with senior stakeholders and drive the discussions independently
- Mentor and lead a team of software developers on analytics related product development practices
Qualifications:
- 10-13 years of data engineering experience.
- Experience in working on machine-learning model deployment/scoring, model lifecycle management, and model performance measurement (see the sketch after this list).
- In-depth understanding of statistics and probability distributions, with experience applying them in big-data software products to solve business problems
- Hands-on programming experience with big-data and analytics related product development using Python, Spark and Kafka to provide solutions for business needs.
- Intuitive, with good interpersonal skills, time management, and task prioritization
- Ability to lead a technical team of software developers and mentor them on good software development practices.
- Ability to quickly grasp business problems and their nuances as they are presented.
- Ability to quickly put together an execution plan and see it through till closure.
- Strong communication, presentation and influencing skills.
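
As a small illustration of the model scoring and performance-measurement experience mentioned above, here is a minimal scikit-learn sketch that trains a classifier, scores a hold-out set, and computes one monitoring metric; the synthetic dataset and the choice of ROC AUC are illustrative assumptions, not part of this role's actual stack.

```python
# Minimal sketch of model scoring and performance measurement with scikit-learn.
# The synthetic dataset and the choice of ROC AUC as the tracked metric are
# illustrative assumptions only.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Stand-in for a real feature table.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_holdout, y_train, y_holdout = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Scoring step: produce probabilities for the hold-out population.
scores = model.predict_proba(X_holdout)[:, 1]

# Performance measurement: one headline metric that could feed a
# model-monitoring dashboard or trigger a retraining decision.
auc = roc_auc_score(y_holdout, scores)
print(f"hold-out ROC AUC: {auc:.3f}")
```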
Education:
- Bachelor’s/University degree or equivalent experience
- Data Science or Analytics specialization preferred
Job Description:
Only female candidates are eligible.
Qualification: Graduation
Age: Around 30 years.
Minimum 6 months to 1 year of experience in the related field is essential.
Good exposure to MS Office.
Administrative and organizational abilities are required.
Excellent written and oral communication skills in English are required.
Intercultural sensitivity and multitasking capacity.
Readiness to take initiative and solve problems.
Sociable, self-motivated, and an initiative taker.



