

About Kloud9:
Kloud9 exists with the sole purpose of providing cloud expertise to the retail industry. Our team of cloud architects, engineers and developers helps retailers launch successful cloud initiatives so you can quickly realise the benefits of cloud technology. Our standardised, proven cloud adoption methodologies reduce adoption time and effort, so you benefit directly from lower migration costs.
Kloud9 was founded with the vision of bridging the gap between e-commerce and the cloud. For retailers in any industry, running e-commerce on physical, on-premise infrastructure is limiting and poses a significant financial challenge.
At Kloud9, we know migrating to the cloud is the single most significant technology shift your company faces today. We are your trusted advisors in transformation and are determined to build a deep partnership along the way. Our cloud and retail experts will ease your transition to the cloud.
Our sole focus is providing cloud expertise to the retail industry, empowering our clients to take their business to the next level. Our team of proficient architects, engineers and developers has been designing, building and implementing solutions for retailers, with an average of more than 20 years' experience.
We are a cloud vendor that is both platform and technology independent. Our vendor independence not only gives us a unique perspective on the cloud market but also ensures that we deliver the cloud solutions that best meet our clients' requirements.
What we are looking for:
● 3+ years’ experience developing Data & Analytic solutions
● Experience building data lake solutions leveraging one or more of the following: AWS EMR, S3, Hive and Spark
● Experience with relational SQL
● Experience with scripting languages such as Shell, Python
● Experience with source control tools such as GitHub and related dev process
● Experience with workflow scheduling tools such as Airflow
● In-depth knowledge of scalable cloud architectures
● Has a passion for data solutions
● Strong understanding of data structures and algorithms
● Strong understanding of solution and technical design
● Has a strong problem-solving and analytical mindset
● Experience working with Agile Teams.
● Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders
● Able to quickly pick up new programming languages, technologies, and frameworks
● Bachelor’s Degree in computer science
Why Explore a Career at Kloud9:
With job opportunities in prime locations in the US, London, Poland and Bengaluru, we help you build a career path in cutting-edge technologies such as AI, machine learning and data science. Be part of an inclusive and diverse workforce that's changing the face of retail technology with its creativity and innovative solutions. Our vested interest in our employees translates into delivering the best products and solutions to our customers.

About Kloud9 Technologies
Kloud9 was founded with the vision of enabling our customers to transform into an intelligent enterprise with our “AI-First” approach. We help our customers in their data transformation and insights transformation journeys and enable them to make smart business decisions.
At Kloud9, we know AI & ML are among the key technologies that can help organizations significantly improve their customer experiences and transform their business operations, enabling them to survive and thrive in this globally competitive market.

We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively with Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer (based on Apache Airflow) for orchestrating data workflows
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
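The ingestion, cleansing, and data-quality responsibilities above can be sketched as a small Python validation step. The field names (`order_id`, `order_date`, `amount`) and rules here are illustrative assumptions, not part of this posting:

```python
from datetime import datetime

def validate_row(row):
    """Apply simple data-quality rules to one ingested record.

    Returns a list of rule violations; an empty list means the row
    passed all checks.
    """
    errors = []
    # Required fields must be present and non-empty.
    for field in ("order_id", "order_date", "amount"):
        if not row.get(field):
            errors.append(f"missing {field}")
    # Dates must parse as ISO-8601 (YYYY-MM-DD).
    if row.get("order_date"):
        try:
            datetime.strptime(row["order_date"], "%Y-%m-%d")
        except ValueError:
            errors.append("bad order_date")
    # Amounts must be numeric and non-negative.
    try:
        if float(row.get("amount", 0)) < 0:
            errors.append("negative amount")
    except (TypeError, ValueError):
        errors.append("non-numeric amount")
    return errors

def clean_rows(rows):
    """Split rows into (valid, rejected) for the downstream load step."""
    valid, rejected = [], []
    for row in rows:
        (valid if not validate_row(row) else rejected).append(row)
    return valid, rejected
```

In a production pipeline, checks like these would typically run inside a Dataflow transform or a Cloud Composer task, with rejected rows routed to a dead-letter table for inspection.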
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
Location: Chennai
Ideal Candidates for this Role should have:
- Minimum of 5 years' experience in testing.
- The candidate should have strong experience in Rest Assured, API Testing and Java.
- Proficiency in API testing tools like Postman
- Good knowledge in automation testing using Selenium WebDriver with Java.
- The candidate should be able to write a program to retrieve and parse JSON.
- Good understanding of REST API methods (GET, PUT, POST, DELETE) and how they work.
- Good understanding of HTTP and of JSON syntax.
- Able to work as an individual contributor; an expert at writing automated scripts and 100% hands-on.
- Should have automated thousands of test cases at various complexity levels and workflows.
- Design, implementation, and delivery of scalable, maintainable, configurable and robust test automation frameworks.
- Ability to analyze and translate requirements and development stories into test scripts to ensure complete test coverage.
- Should have strong knowledge of continuous integration tools like Hudson and Jenkins.
- Strong understanding of testing and automation best practices.
- Proven experience in functional, regression and cross-browser testing.
- Willingness to learn new technologies, approaches and test tools
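The JSON-parsing and REST-method checks described above are asked for in Java with Rest Assured; as a language-agnostic sketch of the same idea, here is a stdlib-Python version with a hypothetical response body (the endpoint shape and field names are assumptions):

```python
import json

# A mock API response body of the shape a GET on a users endpoint
# might return; a real test would obtain this via an HTTP client.
RESPONSE_BODY = '{"status": "ok", "data": {"id": 42, "name": "Asha"}}'

def parse_user(body):
    """Parse a JSON response body and pull out the fields under test."""
    payload = json.loads(body)
    return payload["data"]["id"], payload["data"]["name"]

def check_response(status_code, body):
    """Verify the kind of things an API test typically asserts:
    the HTTP status code, a top-level status field, and payload contents."""
    assert status_code == 200, f"unexpected status {status_code}"
    payload = json.loads(body)
    assert payload["status"] == "ok"
    user_id, name = parse_user(body)
    assert isinstance(user_id, int) and name
    return True
```

In Rest Assured the same assertions would be expressed with its given/when/then chain against a live endpoint; the parsing and validation logic is the part candidates are asked to write themselves.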
InViz is a Bangalore-based startup helping enterprises simplify search and discovery experiences for both their end customers and their internal users. We use state-of-the-art techniques in computer vision, natural language processing, text mining, and other ML areas to extract information and concepts from data in different formats (text, images, videos) and make them easily discoverable through simple, human-friendly touchpoints.
TSDE - Data
Data Engineer:
- Should have 3–6 years of total experience in data engineering.
- Should have experience coding data pipelines on GCP.
- Prior experience with Hadoop systems is ideal, as the candidate may not have end-to-end GCP experience.
- Strong in programming languages like Scala, Python and Java.
- Good understanding of various data storage formats and their advantages.
- Should have exposure to GCP tools for developing end-to-end data pipelines for various scenarios (including ingesting data from traditional databases as well as integrating API-based data sources).
- Should have a business mindset to understand the data and how it will be used for BI and analytics purposes.
- Data Engineer certification preferred.
Experience working with GCP tools such as:
- Store: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore
- Ingest: Stackdriver, Pub/Sub, App Engine, Kubernetes Engine, Kafka, Dataprep, microservices
- Schedule: Cloud Composer
- Processing: Cloud Dataproc, Cloud Dataflow, Cloud Dataprep
- CI/CD: Bitbucket + Jenkins / GitLab
- Atlassian Suite
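As a rough illustration of the "traditional database plus API source" pipelines mentioned above, here is a minimal Python sketch that merges both sources into one record stream; `sqlite3` stands in for the relational source, and the table and field names are assumptions for the example:

```python
import json
import sqlite3

def ingest_from_db(conn):
    """Pull rows from a traditional relational source."""
    cur = conn.execute("SELECT sku, qty FROM inventory")
    return [{"sku": sku, "qty": qty, "source": "db"} for sku, qty in cur]

def ingest_from_api(body):
    """Parse an API-style JSON payload into the same record shape."""
    return [{"sku": item["sku"], "qty": item["qty"], "source": "api"}
            for item in json.loads(body)["items"]]

def build_pipeline(conn, api_body):
    """Combine both sources into one stream for the load step."""
    return ingest_from_db(conn) + ingest_from_api(api_body)
```

On GCP the same pattern would typically ingest via Pub/Sub or Datastream, process with Dataproc or Dataflow, and land in BigQuery, with Cloud Composer scheduling the runs.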


Skills:
Full Stack Developer
Developed Gaming Application
Developed 3D Gaming Application
Basic knowledge of C, C++, MySQL and MongoDB


Our client is a B2B SaaS product company in the HR technology space. They help organisations make informed decisions in areas like hiring, training and career succession. The company was formed in 2010 and has since become a market leader in HR technology. The founders are alumni of Stanford University, and their employees have experience working with PwC, McKinsey and similar organisations. Guided by the founders' vision, the organisation is in expansion mode to capture niche markets and become a global leader in this domain.
- Experience in Back-End development using Ruby on Rails or NodeJS
- Experience in working on at least two of MongoDB / Postgres / MySQL & Redis
- Experience with MVC patterns using frameworks like Rails and ExpressJS
- Strong understanding of RESTful APIs and the HTTP protocol
- Understanding of application security and the ability to implement OWASP-compliant systems
- Strong understanding of the Linux OS, file systems, firewalls, etc.
- 3+ years' experience in Ruby on Rails
- Minimum 3 years in MongoDB / PostgreSQL
- Must be from Product based companies
- Develop system and program specifications to meet client needs
- Analyze the impact of new systems or system changes on existing technology
- Design and maintain database structures to support system requirements
- Build and configure CRM forms, views, dashboards, and workflows
- Create and maintain schema and code to support system/program requirements
- Adhere to company standards, policies, and procedures
- Create and maintain appropriate user and technical documentation
- Provide client support, as needed
- Work with clients to resolve problems in a timely manner and escalate as appropriate
- Participate in company initiatives and committees
Qualifications
- Bachelor’s degree or higher degree in Computer Engineering, Computer Science, Information Technology, or any related field.
- Experience with CRM 2013/2015/2016, D365 CE modules such as Finance/Supply Chain/Sales/Marketing/Service/Field Services and CRM Portal
- Microsoft Dynamics Sales and Customer Engagement applications (CRM) product experience
- Professional CRM certification OR agree to achieve appropriate certification in an agreed-upon timeframe
- Experience implementing and customizing Dynamics CRM and Dynamics 365
- Demonstrated analytical, problem-solving, organizational, interpersonal, communication skills
- The ability to handle multiple tasks simultaneously
- The ability to work independently or as part of a team
- Knowledge in Azure technologies like API, Azure Functions, Logic Apps development is advantageous
An ideal candidate must possess excellent logical and analytical skills. You will work in a team as well as on diverse projects. The candidate must be able to deal smoothly and confidently with clients and personnel.
Key roles and Responsibilities:
⦁ Able to design and build efficient, testable and reliable code.
⦁ Should be a team player, sharing ideas with the team for continuous improvement of the development process.
⦁ Good knowledge of Spring Boot, Spring MVC, J2EE and SQL queries.
⦁ Stay updated on new tools, libraries, and best practices.
⦁ Adaptable and self-motivated; must be willing to learn new things.
⦁ Sound knowledge of HTML, CSS and JavaScript.
Basic Requirements:
⦁ Bachelor's degree in Computer Science Engineering / IT or a related discipline with a good academic record.
⦁ Excellent communication and interpersonal skills.
⦁ Knowledge of the SDLC flow from requirement analysis to the deployment phase.
⦁ Should be able to design, develop and deploy applications.
⦁ Able to identify bugs and devise solutions to address and resolve the issues.



