
Informatica PowerCenter (9.x, 10.2): 2+ years of experience
SQL / PL/SQL: Understanding of SQL procedures; able to convert procedures into Informatica mappings.
Good to have: Knowledge of Windows batch scripting.

Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with at least 3 years of hands-on experience in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, and Iceberg, along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS); a brief sketch follows after this list.
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
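For orientation only, here is a minimal Python sketch of the Parquet-on-object-storage access pattern mentioned in the integration responsibility above. It uses pyarrow; the bucket path and column names are hypothetical placeholders, not part of the role description.

```python
# Illustrative sketch: read a curated Parquet dataset from S3 and run a
# simple aggregation locally. Path and columns are hypothetical.
import pyarrow.dataset as ds
import pyarrow.compute as pc

# pyarrow resolves s3:// URIs with its built-in S3 filesystem
# (credentials come from the standard AWS environment).
dataset = ds.dataset("s3://example-lake/curated/sales/", format="parquet")

# Project only the needed columns and push the filter down to the scan.
table = dataset.to_table(
    columns=["region", "amount"],
    filter=pc.field("order_year") == 2024,
)

print(table.group_by("region").aggregate([("amount", "sum")]))
```

In a Dremio deployment the same curated Parquet layout would typically be exposed through the semantic layer and accelerated with reflections rather than read directly, but the storage-level pattern is the same.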
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Must Have Skills:
- Overall 10+ years of experience in application development using Golang.
- Experience in designing and developing REST-based services / microservices.
- Ability to design scalable, robust, and error-tolerant systems.
- Understanding of software architecture and distributed systems.
- Proficient in writing efficient and optimized algorithms under time constraints.
- Skilled in developing solutions that balance performance, readability, and maintainability.
- Ability to effectively communicate coding decisions and rationale during problem-solving discussions.
- Hands-on experience with queuing mechanisms such as Kafka or RabbitMQ (a brief illustrative sketch follows at the end of this section).
- Candidates should be adaptable and eager to quickly learn and integrate into the existing tech stack if they lack direct experience.
- Candidate should have good communication skills (written and verbal).
- Experience with delivering projects in an agile environment using SCRUM methodologies.
Good to have:
- Experience with AWS, CI/CD, and DevOps.
- Experience using container management tools such as Kubernetes, Docker, and Rancher.
- Experience with any one of these data stores: Cassandra, Postgres, Couchbase, or other NoSQL servers.
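To illustrate the queuing requirement above, here is a minimal sketch using the kafka-python client. Python is used only for brevity (the role itself is Go-focused), and the broker address and topic name are assumptions.

```python
# Minimal producer/consumer sketch for the Kafka requirement above.
# Assumes a local broker at localhost:9092 and a hypothetical "orders" topic.
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("orders", b'{"order_id": 42, "status": "created"}')
producer.flush()  # block until the message is acknowledged

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating if no new messages arrive
)
for message in consumer:
    print(message.value)
```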
Job Description:
- Should have hands-on experience in Web Development
- Good understanding of PHP, Laravel, and the object-oriented programming paradigm.
- Able to understand project requirements and handle projects independently.
- Strong learning capability.
- Good knowledge of jQuery.
- Framework experience would be beneficial.
- Should be comfortable to work with the team.
- Should be comfortable working with any MVC-based framework.
Skills required:
- Sound knowledge of PHP, MySQL, jQuery, etc.
- Able to understand project requirements and handle projects independently.
- Strong learning capability.
- Contribute to all phases of development.
- Knowledge of PHP/CodeIgniter/Laravel is preferred.
- Basic Knowledge of JavaScript, Web Services.
Job Description - Data Engineer
About us
Propellor is aimed at bringing Marketing Analytics and other business workflows to the cloud ecosystem. We work with international clients to realize their analytics ambitions by deploying the latest tech stack and data science and engineering methods, making their business data insightful and actionable.
What is the role?
This team is responsible for building a data platform for many different units. The platform will be built on the cloud, so in this role you will organize and orchestrate different data sources and recommend the services that best fit the goals for each type of data.
Qualifications:
• Experience with Python, SQL, Spark
• Working knowledge of JavaScript
• Knowledge of data processing, data modeling, and algorithms
• Strong in data, software, and system design patterns and architecture
• Building and maintaining APIs (a brief sketch follows after this list)
• Strong soft skills, communication
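To indicate the expected comfort level with building and maintaining APIs, here is a minimal Flask sketch; the endpoint names and payloads are hypothetical and purely illustrative.

```python
# Hypothetical JSON API endpoints, shown only to indicate the expected
# level of comfort with building and maintaining simple services.
from flask import Flask, jsonify

app = Flask(__name__)


@app.route("/health")
def health():
    return jsonify(status="ok")


@app.route("/metrics/<source>")
def metrics(source):
    # In a real service this would query the data platform for `source`.
    return jsonify(source=source, rows_processed=0)


if __name__ == "__main__":
    app.run(port=8000)
```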
Nice to have:
• Experience with cloud: Google Cloud Platform, AWS, Azure
• Knowledge of Google Analytics 360 and/or GA4.
Key Responsibilities
• Design and develop platform based on microservices architecture.
• Work on the core backend and ensure it meets the performance benchmarks.
• Work on the front end with ReactJS.
• Designing and developing APIs for the front end to consume.
• Constantly improve the architecture of the application by clearing the technical backlog.
• Meeting both technical and consumer needs.
• Staying abreast of developments in web applications and programming languages.
What are we looking for?
An enthusiastic individual with the following skills. Please do not hesitate to apply if you do not match all of them; we are open to promising candidates who are passionate about their work and are team players.
• Education - BE/MCA or equivalent.
• Agnostic/Polyglot with multiple tech stacks.
• Worked on open-source technologies – NodeJS, ReactJS, MySQL, NoSQL, MongoDB, DynamoDB.
• Good experience with Front-end technologies like ReactJS.
• Backend exposure – good knowledge of building APIs.
• Worked on serverless technologies.
• Efficient in building microservices, combining server and front end.
• Knowledge of cloud architecture.
• Should have sound working experience with relational and columnar databases.
• Should be innovative and communicative in approach.
• Will be responsible for the functional/technical track of a project.
Whom will you work with?
You will closely work with the engineering team and support the Product Team.
The hiring process includes:
a. A written test on Python and SQL
b. 2-3 rounds of interviews
Immediate joiners will be preferred.
1. Communicate with the clients and understand their business requirements.
2. Build, train, and manage your own team of junior data engineers.
3. Assemble large, complex data sets that meet the client’s business requirements.
4. Identify, design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
5. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, including the cloud.
6. Assist clients with data-related technical issues and support their data infrastructure requirements.
7. Work with data scientists and analytics experts to strive for greater functionality.
Skills required (experience with most of these):
1. Experience with Big Data tools: Hadoop, Spark, Apache Beam, Kafka, etc.
2. Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
3. Experience in ETL and Data Warehousing.
4. Experience and firm understanding of relational and non-relational databases like MySQL, MS SQL Server, Postgres, MongoDB, Cassandra etc.
5. Experience with cloud platforms like AWS, GCP and Azure.
6. Experience with workflow management using tools like Apache Airflow (a brief sketch follows).
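As a small illustration of point 6, here is a minimal Airflow DAG sketch (assuming Airflow 2.4+); the DAG id, schedule, and task bodies are hypothetical placeholders, not a specific client pipeline.

```python
# Hypothetical daily ETL workflow for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw data from the source system")


def load():
    print("write transformed data to the warehouse")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract must finish before load runs
```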
Job description:
- Build and maintain customer relationships in order to understand their needs and business priorities.
- Provide excellent customer service to maintain existing affiliates and acquire new affiliates.
- Manage business negotiations with customers.
- Handle business deal tracking, monitoring, closing, and other related activities as needed.
- Coordinate with various teams to address affiliate needs in an accurate and timely manner.
- Implement affiliate marketing activities, including email campaigns, newsletters, and blogs, to increase revenue targets.
- Recommend process improvements to increase revenue targets.
- Perform new customer acquisition through research, referrals, networking, cold calling, data feeds, and emails.
- Set marketing and sales goals to achieve revenue growth.
- Provide training on affiliate management as needed.
- Maintain open communication with all affiliates on day-to-day issues.
- Develop business strategies to improve the affiliate program.
Experience: up to 2 years

- Computer Science fundamentals in object-oriented design, design patterns, data structures, and algorithm design
- Proficiency with Java
- 1+ years of experience contributing to architecture and design in a product setup
Key Responsibilities
- Understand customer needs by collaborating with Product Managers and Business stakeholders
- Design, development, delivery, and support of large-scale, distributed software applications and tools
- Use software engineering best practices to ensure a high standard of quality and maintainability for all deliverables
- Work in an agile, startup-like development environment, where you are always working on the most important stuff.
- Take initiative and come up with new ideas to invent solutions for customers
You will develop mobile and web applications using the latest technology. You should be good at analysing requirements, translating them into applications, and understanding application flows.
Tech Skills –
Mobile technology – Hybrid (Ionic / Angular / Cordova) – Expert level. Should have completed at least one project with a server component, use of cloud services, and use of device-native capabilities such as camera and accelerometer.
HTML, CSS, Bootstrap, JavaScript – Expert level. Good understanding of concepts; should be able to realize application screens based on the UI provided by designers.
(This is an internship to begin with; later you may be offered a full-time role.)
Seeking candidates with exceptional online bidding skills. Candidates should have extensive, ongoing experience working with portals like Upwork and bringing in projects for the company.
Expert Level Skills Required In:
- Thorough proficiency in Upwork
- Excellent written and verbal English communication skills are a must.
- Good knowledge of IT Services and experience in Sales and Marketing is a must.
- Should possess good interpersonal skills for Account Handling and Relationship Building.
- Should be proactive and should have excellent convincing skills.
- Experience dealing with international clients.
Company: Lincode Labs
Responsibilities:
- Establish and maintain a deep understanding of the overall product portfolio and the competitive landscape.
- Lead technical discovery and prepare/deliver technical presentations explaining our products to prospects and customers.
- Create and deliver powerful presentations and demos to clients that clearly communicate the uniqueness of the value proposition.
- Successfully manage and execute technical proofs of concept (POCs), on-site or remote.
- Represent the product to customers and at field events such as conferences, seminars, etc.
- Evangelize Lincode products to prospects, customers, and partners via presentations and product demos.
- Convey feature input and customer requirements to Product Management teams.
- Partner with sales executives to plan, prepare, and execute strategic deals in complex sales cycles.
- Collaborate with sales teams to understand customer requirements and provide sales support.
- Respond to technical objections and articulate the value and return on investment delivered.
- Liaise with the Engineering, Product, Marketing, and Sales teams to provide consultative technical expertise for all customer needs.
- Effectively communicate and build confidence with customers across teams (Engineering, Product, Marketing, and Sales).
- Engage in and oversee the development of customer proposals, design, and delivery, ensuring all expertise, information, and recommendations are concisely defined.
Requirements:
- 2-3 years of Sales Engineering experience.
- Previous experience with Machine Learning / Computer Vision companies preferred.
- Minimum qualification: graduate.
- Knowledge of installing industrial cameras.
- Excellent presentation, written, and verbal communication skills to communicate professionally.
- Self-motivated with strong interpersonal and problem-solving skills.
- Ability to work well in a highly dynamic team.