
Job Summary:
We are looking for a motivated and detail-oriented Data Engineer with 1–2 years of experience to join our data engineering team. The ideal candidate should have solid foundational skills in SQL and Python, along with exposure to building or maintaining data pipelines. You’ll play a key role in helping to ingest, process, and transform data to support various business and analytical needs.
Key Responsibilities:
- Assist in the design, development, and maintenance of scalable and efficient data pipelines.
- Write clean, maintainable, and performance-optimized SQL queries.
- Develop data transformation scripts and automation using Python.
- Support data ingestion processes from various internal and external sources.
- Monitor data pipeline performance and help troubleshoot issues.
- Collaborate with data analysts, data scientists, and other engineers to ensure data quality and consistency.
- Work with cloud-based data solutions and tools (e.g., AWS, Azure, GCP – as applicable).
- Document technical processes and pipeline architecture.
Core Skills Required:
- Proficiency in SQL (data querying, joins, aggregations, performance tuning).
- Experience with Python, especially in the context of data manipulation (e.g., pandas, NumPy).
- Exposure to ETL/ELT pipelines and data workflow orchestration tools (e.g., Airflow, Prefect, Luigi – preferred).
- Understanding of relational databases and data warehouse concepts.
- Familiarity with version control systems like Git.
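As a flavor of the SQL skills listed above (querying, joins, aggregations), here is a minimal, self-contained sketch using Python's built-in sqlite3 module; the table and column names are purely illustrative, not from any real system.

```python
import sqlite3

# In-memory database with two illustrative tables (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, region TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'US'), (2, 'EU');
    INSERT INTO orders VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# A join plus an aggregation: total order amount per customer region.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS total
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('EU', 50.0), ('US', 200.0)]
```

The same join/group-by pattern scales from this toy example to warehouse-sized tables; performance tuning then becomes a matter of indexing and query plans.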
Preferred Qualifications:
- Experience with cloud data services (AWS S3, Redshift, Azure Data Lake, etc.).
- Familiarity with data modeling and data integration concepts.
- Basic knowledge of CI/CD practices for data pipelines.
- Bachelor’s degree in Computer Science, Engineering, or related field.

About Wissen Technology
The Wissen Group was founded in 2000. Wissen Technology, part of the Wissen Group, was established in 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains.
With offices in the US, India, UK, Australia, Mexico, and Canada, we offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation.
Leveraging our multi-site operations in the USA and India and our world-class infrastructure, we offer a combination of on-site, off-site, and offshore service models. Our technical competencies, proactive management approach, proven methodologies, committed support, and ability to react quickly to urgent needs make us a valued partner for Digital Enablement Services, Managed Services, and Business Services.
We believe that the technology and thought leadership we command in the industry is the direct result of the kind of people we have been able to attract to form this organization (you are one of them!).
Our workforce consists of 1000+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like MIT, Wharton, IITs, IIMs, and BITS and with rich work experience in some of the biggest companies in the world.
Wissen Technology has been certified as a Great Place to Work®. Wissen is committed to providing its people with the best possible opportunities and careers, which extends to delivering the best possible experience and value to our clients.
Senior Software Engineer
- A degree in Data Science, Artificial Intelligence, Computer Science, or a related field.
- Strong proficiency in Python, with a solid understanding of object-oriented programming principles.
- Proven experience in developing applications powered by Large Language Models (LLMs).
- Demonstrated experience in building and maintaining robust ETL (Extract, Transform, Load) data pipelines.
- Ability to build full-stack applications using JavaScript; experience with modern frameworks like React and Next.js is highly desirable.
- A pragmatic and self-starting individual, capable of executing tasks effectively with minimal supervision.
- Experience developing modern frontend applications, with specific expertise in using Web Components.
- Knowledge of simulation principles and discrete optimisation techniques.
- Relevant professional certifications in AI, Machine Learning, or cloud platforms are a plus.
Experience
- Proven track record of building AI (e.g., LLM-powered solutions, agent-based applications), data science, or numerical computing projects end-to-end, from initial concept through deployment and maintenance.
- Extensive hands-on experience (4+ years) with core technologies, including:
- Advanced Python, utilising libraries such as NumPy, Pandas, and frameworks for multiprocessing and asynchronous programming.
- Relational database management systems.
- Development and deployment within Linux environments.
- Collaborative development workflows using Git.
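The asynchronous-programming experience mentioned above can be illustrated with a minimal asyncio sketch; the function and item names are purely illustrative. The idea is to run several I/O-bound operations concurrently instead of one after another.

```python
import asyncio

async def fetch(item: str) -> str:
    # Stand-in for an I/O-bound call (e.g., an HTTP request or DB query).
    await asyncio.sleep(0.01)
    return item.upper()

async def main() -> list[str]:
    # Launch the three "fetches" concurrently; results come back in order.
    return await asyncio.gather(*(fetch(x) for x in ["a", "b", "c"]))

results = asyncio.run(main())
print(results)  # ['A', 'B', 'C']
```

With real network calls, the concurrent version completes in roughly the time of the slowest call rather than the sum of all of them.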
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively with Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer (based on Apache Airflow) for orchestration of data workflows
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
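The data-quality responsibilities above can be sketched, independently of any particular GCP service, as a small set of validation rules applied to incoming records before they are loaded; the rule set and field names here are illustrative assumptions, not from a real pipeline.

```python
# Minimal data-quality check sketch: validate records before loading them.
# Field names and rules are hypothetical examples.

RULES = {
    "id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(record: dict) -> list[str]:
    """Return the names of fields that fail their rule (empty list = valid)."""
    return [field for field, rule in RULES.items()
            if field not in record or not rule(record[field])]

good = {"id": 1, "email": "a@example.com", "amount": 9.5}
bad = {"id": -3, "email": "not-an-email", "amount": 9.5}
print(validate(good))  # []
print(validate(bad))   # ['id', 'email']
```

In practice, records failing validation would be routed to a quarantine table or dead-letter location and surfaced through monitoring rather than silently dropped.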
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
4+ years of experience in Test Automation using Selenium and C# with SpecFlow
- Selenium Experience: Minimum 3 Years
- C# with SpecFlow Experience: Minimum 2 Years
Job Description
- Understanding and Analyzing the Application Under Test in terms of Object Identification.
- Creating Test Scenarios and Collecting Test Data.
- Identifying end-to-end scenarios and code modularity.
- Analyzing Test Results and Reporting Defects
- Tracking defects and selecting test cases for re-testing and regression testing.
- Creating, Organizing, and Managing Test Automation Resources (Scripts & Function Libraries etc.)
- Error handling.
- Coordinating with test team members and the development team to resolve issues.
- Working experience on Azure DevOps
Role: Data Scientist
Experience: 8+ years
Location: Pune (preferred)
As a Data Scientist who can help us with our innovation lab initiative, you will be responsible for extracting insights from large datasets and turning them into actionable strategies to drive business decisions. You will collaborate with cross-functional teams to identify opportunities for leveraging data to solve business problems and improve processes.
Advanced Degree: PhD or equivalent in Computer Science, Data Science, Statistics, Mathematics, or a related field, demonstrating a strong foundation in theoretical and practical aspects of AI and machine learning.
Deep Technical Expertise: Proven track record of applying advanced statistical models, machine learning algorithms, and computational methodologies to solve complex problems in various domains.
Hands-on Experience with AI Technologies: Extensive experience in designing, implementing, and deploying AI solutions, including but not limited to natural language processing, computer vision, and predictive analytics projects.
Strong Programming Skills: Proficiency in programming languages such as Python, R, or Scala, and familiarity with AI and machine learning frameworks (e.g., TensorFlow, PyTorch).
Data Manipulation and Analysis: Expertise in handling and analyzing large datasets using SQL, NoSQL, and data processing frameworks (e.g., Spark).
Excellent Communication Skills: Ability to clearly articulate complex technical concepts, findings, and the business value of AI solutions to both technical and non-technical stakeholders.
Understanding of AI Ethics and Bias: Knowledge of ethical AI practices, including bias detection and mitigation strategies, to ensure the development of fair and responsible AI systems.
Problem-Solving Ability: Demonstrated ability to apply analytical and creative thinking to complex challenges, with a track record of innovative solutions in AI.
Collaboration and Leadership: Experience working in cross-functional teams, with the ability to lead projects and mentor junior team members in AI methodologies and best practices.
Industry-Specific AI Application Knowledge: Demonstrated experience in applying AI solutions to real-world problems in industries such as healthcare, finance, retail, or manufacturing, showing a deep understanding of sector-specific challenges and opportunities.
LLM Development and Customization: Hands-on experience in developing, fine-tuning, and deploying Large Language Models for applications such as conversational agents, content generation, and semantic analysis.
Advanced Natural Language Understanding (NLU) and Generation (NLG): Expertise in leveraging LLMs for complex NLU and NLG tasks, demonstrating a deep understanding of language model capabilities and limitations.
About Intraedge: https://intraedge.com/
Intraedge is a technology, products, and learning organization founded in 2002, with offices in the US, India, Europe, Canada, and Singapore. We provide our clients with the resources and expertise to enhance business performance through technology.
Qualifications:
• M Tech/ B Tech/ BE/ MCA/ MS in Computer Science, Information Technology, or equivalent.
• The degree must be obtained through a regular on-campus program from a recognized university.
• Proven experience as a .NET Developer with 4–9 years of relevant work experience.
• Strong proficiency in .NET and the MVC architecture.
• Experience with MySQL database management.
• Knowledge of front-end technologies such as HTML, CSS, and JavaScript.
Jamboree Education Pvt. Ltd. is hiring for Faculty Verbal (GMAT/GRE/SAT) at the following locations:
Delhi, Bangalore, Mumbai, and Kolkata.
Job Description:
Teaching Verbal Ability to students preparing for GRE and SAT at the study centers of Jamboree as per Jamboree's pedagogy.
Required Candidate profile:
Preferred experience teaching Verbal for competitive exams such as GMAT, GRE, SAT, and CAT.
Well versed and trained in teaching Verbal/English concepts.
Excellent communication ability.
Six-day work week including weekends, with any one day off between Monday and Friday.
Hours: 11 am to 8 pm on weekdays and 10 am to 8 pm on weekends.
Who are we?
We are a venture capital-backed software development company headquartered in Canada. We develop in-house products to disrupt one industry at a time and partner as a technology service provider to selected startups.
Who are you?
Experience writing applications using Node.js, including Express or similar frameworks.
Strong in MySQL or another database such as MongoDB.
Proficient in JavaScript, with solid knowledge of open-source tools, frameworks, and broader cutting-edge server-side technologies.
Excellent data structure, algorithm, and problem-solving skills.
Has created and consumed various APIs.
Should be an active contributor to developer communities such as Stack Overflow, GitHub, and Google Developer Groups (GDGs).
Customer-focused, adaptable to change, collaborative, and able to multitask.
A proven performer and team player who enjoys challenging assignments in a high-energy, fast-growing startup workplace.
A self-starter who works well with minimal guidance and in a fluid environment.
Some of the technologies we use are:
NodeJS
ExpressJS
Angular 9
AWS
Github
PostgreSQL, MongoDB
Selenium
Responsibilities:
- Quickly test and evaluate Hyperledger Besu vs. GoQuorum to determine the platform that best fits our use case.
- Focus on core blockchain infrastructure, with attention to privacy, scalability, and real-world deployment challenges.
- Design, code, and test, as well as deploy, document, and support.
- Solve interesting technical and business challenges.
- Collaborate with our mobile/web team to integrate and promote best practices.
