Position Overview: We are looking for an experienced and highly skilled Senior Data Engineer to join our team and help design, implement, and optimize data systems that support high-end analytical solutions for our clients. As a customer-centric Data Engineer, you will work closely with clients to understand their business needs and translate them into robust, scalable, and efficient technical solutions. You will be responsible for end-to-end data modeling, integration workflows, and data transformation processes while ensuring security, privacy, and compliance. In this role, you will also leverage the latest advancements in artificial intelligence, machine learning, and large language models (LLMs) to deliver high-impact solutions that drive business success. The ideal candidate will have a deep understanding of data infrastructure, optimization techniques, and cost-effective data management.
Key Responsibilities:
• Customer Collaboration:
– Partner with clients to gather and understand their business requirements, translating them into actionable technical specifications.
– Act as the primary technical consultant to guide clients through data challenges and deliver tailored solutions that drive value.
• Data Modeling & Integration:
– Design and implement scalable, efficient, and optimized data models to support business operations and analytical needs.
– Develop and maintain data integration workflows to seamlessly extract, transform, and load (ETL) data from various sources into data repositories.
– Ensure smooth integration between multiple data sources and platforms, including cloud and on-premises systems.
• Data Processing & Optimization:
– Develop, optimize, and manage data processing pipelines to enable real-time and batch data processing at scale.
– Continuously evaluate and improve data processing performance, optimizing for throughput while minimizing infrastructure costs.
• Data Governance & Security:
– Implement and enforce data governance policies and best practices, ensuring data security, privacy, and compliance with relevant industry regulations (e.g., GDPR, HIPAA).
– Collaborate with security teams to safeguard sensitive data and maintain privacy controls across data environments.
• Cross-Functional Collaboration:
– Work closely with data engineers, data scientists, and business analysts to ensure that the data architecture aligns with organizational objectives and delivers actionable insights.
– Foster collaboration across teams to streamline data workflows and optimize solution delivery.
• Leveraging Advanced Technologies:
– Utilize AI, machine learning models, and large language models (LLMs) to automate processes, accelerate delivery, and provide smart, data-driven solutions to business challenges.
– Identify opportunities to apply cutting-edge technologies to improve the efficiency, speed, and quality of data processing and analytics.
• Cost Optimization:
– Proactively manage infrastructure and cloud resources to optimize throughput while minimizing operational costs.
– Make data-driven recommendations to reduce infrastructure overhead and increase efficiency without sacrificing performance.
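The extract-transform-load workflow described in the responsibilities above can be sketched in miniature. This is an illustrative example only: the column names, the cleansing rules, and the in-memory list standing in for a warehouse are all hypothetical, and a production pipeline would use a framework such as Spark, NiFi, or Talend as named later in this posting.

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize field names and cast amounts to float."""
    return [
        {"customer": r["Customer"].strip().lower(),
         "amount": float(r["Amount"])}
        for r in rows
    ]

def load(rows: list[dict], warehouse: list) -> int:
    """Load: append transformed rows to the target store."""
    warehouse.extend(rows)
    return len(rows)

# Run the three steps end to end against a tiny sample source.
raw = "Customer,Amount\nAcme ,100.5\nGlobex,42\n"
warehouse: list = []
loaded = load(transform(extract(raw)), warehouse)
```

The same extract/transform/load seams are where schema validation, deduplication, and error handling would attach in a real pipeline.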
Qualifications:
• Experience:
– Proven experience (5+ years) as a Data Engineer or similar role, designing and implementing data solutions at scale.
– Strong expertise in data modeling, data integration (ETL), and data transformation processes.
– Experience with cloud platforms (AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark).
• Technical Skills:
– Advanced proficiency in SQL, data modeling tools (e.g., Erwin, PowerDesigner), and data integration frameworks (e.g., Apache NiFi, Talend).
– Strong understanding of data security protocols, privacy regulations, and compliance requirements.
– Experience with data storage solutions (e.g., data lakes, data warehouses, NoSQL, relational databases).
• AI & Machine Learning Exposure:
– Familiarity with leveraging AI and machine learning technologies (e.g., TensorFlow, PyTorch, scikit-learn) to optimize data processing and analytical tasks.
– Ability to apply advanced algorithms and automation techniques to improve business processes.
• Soft Skills:
– Excellent communication skills to collaborate with clients, stakeholders, and cross-functional teams.
– Strong problem-solving ability with a customer-centric approach to solution design.
– Ability to translate complex technical concepts into clear, understandable terms for non-technical audiences.
• Education:
– Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent practical experience).
LIFE AT FOUNTANE:
- Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
- Competitive pay
- Health insurance for spouses, kids, and parents.
- PF/ESI or equivalent
- Individual/team bonuses
- Employee stock ownership plan
- Fun/challenging variety of projects/industries
- Flexible workplace policy - remote/physical
- Flat organization - no micromanagement
- Individual contribution - set your deadlines
- Above all - culture that helps you grow exponentially!
A LITTLE BIT ABOUT THE COMPANY:
Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.
We’re a team of 120+ people from around the world who are radically open-minded and believe in excellence, respecting one another, and pushing our boundaries further than ever before.

Similar jobs
Requirements
- Design, implement, and manage CI/CD pipelines using Azure DevOps, GitHub, and Jenkins for automated deployments of applications and infrastructure changes.
- Architect and deploy solutions on Kubernetes clusters (EKS and AKS) to support containerized applications and microservices architecture.
- Collaborate with development teams to streamline code deployments, releases, and continuous integration processes across multiple environments.
- Configure and manage Azure services including Azure Synapse Analytics, Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), and other data services for efficient data processing and analytics workflows.
- Utilize AWS services such as Amazon EMR, Amazon Redshift, Amazon S3, Amazon Aurora, and IAM policies for data management, warehousing, and governance, alongside Azure Monitor for observability.
- Implement infrastructure as code (IaC) using tools like Terraform or CloudFormation to automate provisioning and management of cloud resources.
- Ensure high availability, performance monitoring, and disaster recovery strategies for cloud-based applications and services.
- Develop and enforce security best practices and compliance policies, including IAM policies, encryption, and access controls across Azure environments.
- Collaborate with cross-functional teams to troubleshoot production issues, conduct root cause analysis, and implement solutions to prevent recurrence.
- Stay current with industry trends, best practices, and evolving technologies in cloud computing, DevOps, and container orchestration.
**Qualifications:**
- Bachelor’s degree in Computer Science, Engineering, or related field; or equivalent work experience.
- 5+ years of experience as a DevOps Engineer or similar role with hands-on expertise in AWS and Azure cloud environments.
- Strong proficiency in Azure DevOps, Git, GitHub, Jenkins, and CI/CD pipeline automation.
- Experience deploying and managing Kubernetes clusters (EKS, AKS) and container orchestration platforms.
- Deep understanding of cloud-native architectures, microservices, and serverless computing.
- Familiarity with Azure Synapse, ADF, ADLS, and AWS data services (EMR, Redshift, Glue) for data integration and analytics.
- Solid grasp of infrastructure as code (IaC) tools like Terraform, CloudFormation, or ARM templates.
- Experience with monitoring tools (e.g., Prometheus, Grafana) and logging solutions for cloud-based applications.
- Excellent troubleshooting skills and ability to resolve complex technical issues in production environments.
Job Role: Placement Coordinator
Roles & Responsibilities:
- Proactively identify, contact, and build relationships with HRs, hiring managers, and recruiters across IT and digital industries.
- Search and monitor job sites, industry platforms, and portals for relevant openings.
- Share relevant job openings with eligible candidates and ensure timely application submissions.
- Coordinate and execute successful placement drives to connect students with hiring companies.
- Reach out to recruiters and hiring managers to understand their workforce requirements.
- Maintain and update placement data for transparency and accountability.
Key Skills Required:
- Must have strong English communication skills.
- Extroverted and outgoing personality.
- Any experience is fine.
- Strong ability to identify and build relationships.
- Excellent communication, research, and networking skills.
- Effective organizational and reporting abilities.
- Ability to work as part of a team.
Candidates from the Western line will be preferred.
Job Types: Full-time, Permanent
Pay: ₹18,000.00 - ₹30,000.00 per month
Office Timing: 9:00 am - 6:00 pm | Mon - Sat | 6 Days Working

Job Summary:
We are looking for a skilled and motivated Python AWS Engineer to join our team. The ideal candidate will have strong experience in backend development using Python, cloud infrastructure on AWS, and building serverless or microservices-based architectures. You will work closely with cross-functional teams to design, develop, deploy, and maintain scalable and secure applications in the cloud.
Key Responsibilities:
- Develop and maintain backend applications using Python and frameworks like Django or Flask
- Design and implement serverless solutions using AWS Lambda, API Gateway, and other AWS services
- Develop data processing pipelines using services such as AWS Glue, Step Functions, S3, DynamoDB, and RDS
- Write clean, efficient, and testable code following best practices
- Implement CI/CD pipelines using tools like CodePipeline, GitHub Actions, or Jenkins
- Monitor and optimize system performance and troubleshoot production issues
- Collaborate with DevOps and front-end teams to integrate APIs and cloud-native services
- Maintain and improve application security and compliance with industry standards
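As a small illustration of the serverless pattern named in the responsibilities above, here is a minimal AWS Lambda handler sketch. The event shape follows the API Gateway proxy integration; the greeting logic and parameter name are purely hypothetical.

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy integration.

    Reads an optional 'name' query-string parameter from the proxy
    event and returns a JSON greeting. 'context' is unused here.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# The handler can be exercised locally, without any AWS infrastructure,
# by passing a dict shaped like the API Gateway proxy event.
response = lambda_handler({"queryStringParameters": {"name": "dev"}}, None)
```

Keeping the handler a plain function of the event dict is what makes this style easy to unit-test before deployment.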
Required Skills:
- Strong programming skills in Python
- Solid understanding of AWS cloud services (Lambda, S3, EC2, DynamoDB, RDS, IAM, API Gateway, CloudWatch, etc.)
- Experience with infrastructure as code (e.g., CloudFormation, Terraform, or AWS CDK)
- Good understanding of RESTful API design and microservices architecture
- Hands-on experience with CI/CD, Git, and version control systems
- Familiarity with containerization (Docker, ECS, or EKS) is a plus
- Strong problem-solving and communication skills
Preferred Qualifications:
- Experience with PySpark, Pandas, or data engineering tools
- Working knowledge of Django, Flask, or other Python frameworks
- AWS Certification (e.g., AWS Certified Developer – Associate) is a plus
Educational Qualification:
- Bachelor's or Master’s degree in Computer Science, Engineering, or related field
Job Title: DevOps Engineer
Location: Bangalore, Karnataka
Job Type: Full-time
Experience: 3-5 years
Notice Period: Immediate or within 15 days
About Us:
Wissen Technology is a dynamic and forward-thinking organization that thrives on innovation and excellence. We are looking for a talented DevOps Engineer passionate about Linux, networking, and cloud technologies to join our team. If you enjoy solving complex problems, automating everything, and working in a fast-paced environment, we want to hear from you!
Key Responsibilities:
- Linux Mastery: Utilize your solid understanding of Linux systems to manage, monitor, and maintain our infrastructure, including troubleshooting virtual machine resources such as CPU and memory. Your expertise in Linux is crucial for ensuring system stability and performance.
- Networking Proficiency: Leverage your networking knowledge to design, deploy, and troubleshoot network infrastructure, ensuring secure and efficient communication between systems and services.
- Cloud Management: Apply working knowledge of cloud technologies (Azure, AWS, GCP) and Kubernetes to deploy, manage, and scale our services. Your role will involve ensuring our cloud infrastructure is reliable, cost-effective, and scalable.
- Automation and Scripting: Automate repetitive tasks in Unix shell script or Python to streamline operations, and develop and maintain CI/CD pipelines using tools like Jenkins and Argo to automate deployments and improve our software delivery process.
- Collaboration: Work closely with development teams to ensure smooth integration of new software and services. Provide guidance on best practices for system architecture and deployment.
Qualifications:
- Mandatory: Strong understanding of Linux and Unix systems.
- Networking: Solid knowledge of networking concepts, including DNS, TCP/IP, VPN, firewalls, and load balancing.
- Cloud: Experience with at least one major cloud platform (AWS, Azure, GCP) and Kubernetes.
- Strong scripting skills (Bash, Python, etc.) are highly desirable.
- Familiarity with containerization technologies like Docker and Kubernetes is a bonus.
- Experience with automation tools like Ansible, Puppet, or Chef is a plus.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
Why Join Us:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- Collaborative and inclusive work environment.
- Work with cutting-edge technologies and a talented team.
The job description is as follows:
- Onboarding customers, collecting all the required documents, executing agreements, etc.
- Collecting repayments and educating customers on the benefits of credit and timely repayment.
- Building and maintaining relationships with clients and sellers' ground teams.
- Meeting and exceeding sales targets.
- Staying up-to-date on company policies and competitor activities.
- Collaborating with internal teams and anchor teams to ensure successful product placement.
- Providing excellent customer service to customers/borrowers.
- At least 4 years' experience as a Node.js developer.
- Experience in microservices.
- Extensive knowledge of JavaScript, web stacks, libraries, and frameworks.
- Mandatory experience in PostgreSQL and MySQL.
- Knowledge of front-end technologies such as HTML5 and CSS3.
- Superb interpersonal, communication, and collaboration skills.
- Exceptional analytical and problem-solving aptitude.


Skill: Node.js, Angular/React, SQL
Notice period: Immediate or 30 days
Location: Bangalore
Experience: 5 to 8 years
We are looking for a Project Manager to manage key client projects. Responsibilities include coordinating and completing projects on time, within budget, and within scope.
- Oversee all aspects of projects.
- Set deadlines, assign responsibilities, and monitor and summarize project progress.
- Prepare reports for upper management regarding project status.
- Lead and direct the work of others.
- Enhance department and organization reputation by accepting ownership of new and different requests and exploring opportunities to add value.


