
Required: a passionate Graphic Design trainer for an IT institute, with 1-3 years of relevant experience. Classes run 6 days per week, 1 hour per day.

If interested, please send your resume to ayushi.dwivedi at cloudsufi.com.
The candidate must currently be located in Bangalore (client office visits are required) and must be open to visiting the Noida office for one week per quarter.
About Us
CLOUDSUFI, a Google Cloud Premier Partner, is a leading global provider of data-driven digital transformation for cloud-based enterprises. With a global presence and a focus on Software & Platforms, Life Sciences and Healthcare, Retail, CPG, Financial Services, and Supply Chain, CLOUDSUFI is positioned to meet customers where they are in their data monetization journey.
Our Values
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of life for our family, customers, partners, and the community.
Equal Opportunity Statement
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, colour, religion, gender, gender identity or expression, sexual orientation, or national origin. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/
Job Summary
We are seeking a highly skilled and motivated Data Engineer to join our Development POD for the Integration Project. The ideal candidate will be responsible for designing, building, and maintaining robust data pipelines to ingest, clean, transform, and integrate diverse public datasets into our knowledge graph. This role requires a strong understanding of Google Cloud Platform (GCP) services, data engineering best practices, and a commitment to data quality and scalability.
Key Responsibilities
ETL Development: Design, develop, and optimize data ingestion, cleaning, and transformation pipelines for various data sources (e.g., CSV, API, XLS, JSON, SDMX) using Google Cloud Platform services (Cloud Run, Dataflow) and Python; a brief pipeline sketch follows this list.
Schema Mapping & Modeling: Work with LLM-based auto-schematization tools to map source data to our schema.org vocabulary, defining appropriate Statistical Variables (SVs) and generating MCF/TMCF files.
Entity Resolution & ID Generation: Implement processes for accurately matching new entities with existing IDs or generating unique, standardized IDs for new entities.
Knowledge Graph Integration: Integrate transformed data into the Knowledge Graph, ensuring proper versioning and adherence to existing standards.
API Development: Develop and enhance REST and SPARQL APIs via Apigee to enable efficient access to integrated data for internal and external stakeholders.
Data Validation & Quality Assurance: Implement comprehensive data validation and quality checks (statistical, schema, anomaly detection) to ensure data integrity, accuracy, and freshness. Troubleshoot and resolve data import errors.
Automation & Optimization: Collaborate with the Automation POD to leverage and integrate intelligent assets for data identification, profiling, cleaning, schema mapping, and validation, aiming for significant reduction in manual effort.
Collaboration: Work closely with cross-functional teams, including Managed Service POD, Automation POD, and relevant stakeholders.
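To give a concrete flavor of the ETL work described above, here is a minimal sketch using Apache Beam, the SDK behind GCP Dataflow. This is an illustration only: the project, bucket, table, and column names are hypothetical placeholders, not details of the actual project.

```python
# Minimal Beam pipeline: read a CSV from GCS, clean rows, append to BigQuery.
# Assumes the destination table already exists; all names are placeholders.
import csv

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line):
    """Parse one CSV line into a dict; the three columns are illustrative."""
    name, year, value = next(csv.reader([line]))
    return {"name": name.strip(), "year": int(year), "value": float(value)}


def run():
    options = PipelineOptions(
        runner="DataflowRunner",      # or "DirectRunner" for local testing
        project="my-project",         # hypothetical project
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/input.csv",
                                             skip_header_lines=1)
            | "Parse" >> beam.Map(parse_row)
            | "DropEmpty" >> beam.Filter(lambda row: row["name"])
            | "Write" >> beam.io.WriteToBigQuery(
                "my-project:my_dataset.observations",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```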
Qualifications and Skills
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.
Experience: 3+ years of proven experience as a Data Engineer, with a strong portfolio of successfully implemented data pipelines.
Programming Languages: Proficiency in Python for data manipulation, scripting, and pipeline development.
Cloud Platforms and Tools: Expertise in Google Cloud Platform (GCP) services, including Cloud Storage, Cloud SQL, Cloud Run, Dataflow, Pub/Sub, BigQuery, and Apigee. Proficiency with Git-based version control.
Core Competencies:
Must Have - SQL, Python, BigQuery, GCP Dataflow (Apache Beam), Google Cloud Storage (GCS)
Must Have - Proven ability in comprehensive data wrangling, cleaning, and transforming complex datasets from various formats (e.g., API, CSV, XLS, JSON)
Secondary Skills - SPARQL, Schema.org, Apigee, CI/CD (Cloud Build), GCP, Cloud Data Fusion, Data Modelling
Solid understanding of data modeling, schema design, and knowledge graph concepts (e.g., Schema.org, RDF, SPARQL, JSON-LD); see the query sketch after this list.
Experience with data validation techniques and tools.
Familiarity with CI/CD practices and the ability to work in an Agile framework.
Strong problem-solving skills and keen attention to detail.
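As a small illustration of the SPARQL and knowledge-graph items above, here is a sketch of querying a SPARQL endpoint from Python with the SPARQLWrapper library. The endpoint URL and the schema.org-style query are hypothetical, not an actual project endpoint.

```python
# Minimal SPARQL query from Python using SPARQLWrapper.
# The endpoint URL and the schema.org-style query are illustrative placeholders.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://example.org/sparql")  # hypothetical endpoint
sparql.setReturnFormat(JSON)
sparql.setQuery("""
    PREFIX schema: <http://schema.org/>
    SELECT ?name WHERE {
        ?org a schema:Organization ;
             schema:name ?name .
    } LIMIT 10
""")

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["name"]["value"])
```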
Preferred Qualifications:
Experience with LLM-based tools or concepts for data automation (e.g., auto-schematization).
Familiarity with similar large-scale public dataset integration initiatives.
Experience with multilingual data integration.
Key Responsibilities:
- Source and attract top IT talent through portals, social media, referrals, and networking
- Conduct initial screening calls and video interviews to assess candidate potential
- Collaborate closely with hiring managers to understand job requirements
- Manage and nurture candidate relationships to ensure a smooth hiring experience
- Keep track of hiring metrics like time-to-hire and quality-of-hire
- Stay updated with trends in IT hiring and recruitment strategies
Requirements:
- 0–2 years of experience in recruitment (IT or non-IT)
- Excellent communication & interpersonal skills
- Eagerness to learn IT roles and technologies
- Familiarity with ATS (Applicant Tracking Systems) is a plus
- Strong time management & multitasking abilities
- Bachelor’s degree in HR, Business, or any relevant field
- Developing core infrastructure in Python and Django (a brief model sketch follows this list).
- Developing models and business logic (e.g., transactions, payments, diet plans, search, etc.).
- Architecting servers and services that enable new product features.
- Building out newly enabled product features.
- Monitoring system uptime and errors to drive us toward a high-performing and reliable product.
- Take ownership and understand the need for code quality, elegance, and robust infrastructure.
- Worked collaboratively on a software development team.
- Built scalable web applications.
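As a rough sketch of the "models and business logic" bullet above: the model name, fields, and helper below are hypothetical examples, not the product's actual schema.

```python
# Minimal Django model plus a thin business-logic helper.
# The Payment model and its fields are hypothetical examples.
from django.db import models


class Payment(models.Model):
    PENDING, COMPLETED, FAILED = "pending", "completed", "failed"
    STATUS_CHOICES = [(PENDING, "Pending"), (COMPLETED, "Completed"), (FAILED, "Failed")]

    user = models.ForeignKey("auth.User", on_delete=models.CASCADE)
    amount = models.DecimalField(max_digits=10, decimal_places=2)
    status = models.CharField(max_length=10, choices=STATUS_CHOICES, default=PENDING)
    created_at = models.DateTimeField(auto_now_add=True)

    def mark_completed(self):
        # Keep state transitions on the model, not in the view.
        self.status = self.COMPLETED
        self.save(update_fields=["status"])
```

Keeping transitions like mark_completed on the model keeps views thin and the business logic easy to test in isolation.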
Skills:
- Minimum 4 years of industry or open-source experience.
- Proficient in at least one OO language: Python (preferred), Golang, or Java.
- Writing high-performance, reliable and maintainable code.
- Good knowledge of database structures, theories, principles, and practices.
- Experience working with AWS components (EC2, S3, RDS, SQS, ECS, Lambda)
You will:
- Write excellent production code and tests and help others improve in code-reviews
- Analyze high-level requirements to design, document, estimate, and build systems
- Coordinate across teams to identify, resolve, mitigate and prevent technical issues
- Coach and mentor engineers within the team to develop their skills and abilities
- Continuously improve the team's practices in code-quality, reliability, performance, testing, automation, logging, monitoring, alerting, and build processes
You have:
For (Fullstack):
- 2 - 10 years of experience
- Strong with DS & Algorithms
- Hands-on experience with programming languages: JavaScript (React or Angular), Python, SQL.
- Experience with AWS.
For (Geo Team):
- 4 - 10 years of experience
- Experience with big data technologies like Hadoop, Spark, MapReduce, Kafka, etc.
- Experience using object-oriented languages (Java, Python)
- Experience in working with different AWS technologies.
- Experience in software design, architecture and development.
- Excellent competencies in data structures & algorithms.
For (Backend):
- 2 - 10 years of experience
- Hands-on product development experience using Java / C++ / Python
- Experience with AWS, SQL, Git
- Strong with Data structures and Algorithms
Additional nice to have skills/certifications:
For Java skill set:
Mockito, Grizzly, Netty, Vert.x, Jersey / JAX-RS, Swagger / OpenAPI, Nginx, Protocol Buffers, Thrift, Aerospike, Redis, Kinesis, sed, awk, Perl
For Python skill set: data engineering experience, Athena, Lambda, EMR, Spark, Glue, Step Functions, Hadoop, Kinesis, ORC, Parquet, Perl, awk, Redshift
For (Data Engineering):
- 2 - 10 years of experience
- Experience with object-oriented scripting languages such as Python.
- Experience with AWS cloud services: EC2, RDS, Redshift, S3, Athena, Glue
- Must be proficient in Git, Jenkins, and CI/CD (Continuous Integration / Continuous Deployment)
- Experience with big data technologies like Hadoop, MapReduce, Spark, etc. (see the sketch after this list)
- Experience with Amazon Web Services and Docker
- BE in Computer Science, MCA, or equivalent
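To illustrate the Spark-plus-AWS combination listed above, here is a minimal PySpark sketch of the common S3 CSV-to-Parquet pattern. The bucket paths and column names are hypothetical, and the S3A connector is assumed to be configured (e.g., on EMR or Glue).

```python
# Minimal PySpark job: read CSV from S3, apply a simple transform, write Parquet.
# Bucket paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-csv-to-parquet").getOrCreate()

df = spark.read.csv("s3a://my-bucket/raw/events.csv", header=True, inferSchema=True)

cleaned = (
    df
    .dropna(subset=["event_id"])                     # drop rows missing a key
    .withColumn("event_date", F.to_date("event_ts"))  # derive a partition column
)

(cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://my-bucket/curated/events/"))
```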
- Solid hands-on experience with JMeter or related tools.
- Knowledge of CI/CD implementation for load testing Java applications. MongoDB knowledge and hands-on experience preferred.
- Knowledge of and hands-on experience with performance testing in complex enterprise cloud environments.
- Experience with AWS services.
- Familiarity with microservices.
- Understanding the non-functional requirements
- Evaluating the objectives of the service level agreement
- Analyzing business scenarios
- Designing the test scripts and the workload models (a load-test sketch follows this list)
- Identifying parameters for testing
- Executing performance tests & establishing checkpoints
- Using consistent metrics for monitoring
- Interpreting results and graphs
- Understanding and describing the relationship between queues and sub-systems
- Identifying suggestions for performance tuning
- Working with cross-functional teams to analyze and profile the application for performance and scalability bottlenecks
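JMeter test plans are usually authored through its own GUI/XML format, so as a code-level illustration of the workload-model ideas above, here is a minimal sketch in Locust, one of the "related tools". The endpoints and task weights are hypothetical.

```python
# Minimal Locust workload model: two weighted tasks against a hypothetical API.
from locust import HttpUser, task, between


class ApiUser(HttpUser):
    # Think time between requests: part of the workload model.
    wait_time = between(1, 3)

    @task(3)  # weighted 3x: browsing dominates this scenario
    def list_items(self):
        self.client.get("/api/items")

    @task(1)
    def create_item(self):
        self.client.post("/api/items", json={"name": "test"})
```

It could be run with something like `locust -f loadtest.py --host https://staging.example.com`, where the host is again a placeholder.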
Experience:
- Minimum 6-8 years of experience
- Not more than 15 years of experience
Location
- Remotely, anywhere in India
Timings:
- 40 hours a week (11 AM to 7 PM).
Position:
- Full time/Direct
Other Benefits
- We offer great benefits such as PF, medical insurance, 12 annual company holidays, 12 PTO days per year, annual increments, a Diwali bonus, spot bonuses, and other incentives.
- We don't believe in locking people in with long notice periods. You will stay here because you love the company. Our notice period is only 15 days.
● Integration of user-facing elements developed by front-end developers with server-side logic.
● Writing reusable, testable, and efficient code.
● Design and implementation of low-latency, high-availability, and performant applications.
● Implementation of security and data protection.
● Integration of data storage solutions.
● Familiarity with GraphQL, REST APIs, MongoDB, SQL, NoSQL, AWS services, and Firebase (knowledge of WebSockets is optional); a minimal sketch follows.
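As an illustration of the REST-plus-MongoDB pairing in the last bullet (the pattern is the same regardless of language), here is a minimal sketch in Python using Flask and PyMongo. The routes, database name, and connection string are hypothetical placeholders.

```python
# Minimal REST endpoints backed by MongoDB, using Flask and PyMongo.
# Connection string, database, and routes are hypothetical placeholders.
from flask import Flask, jsonify, request
from pymongo import MongoClient

app = Flask(__name__)
db = MongoClient("mongodb://localhost:27017")["appdb"]


@app.route("/api/users", methods=["POST"])
def create_user():
    doc = {"name": request.json["name"], "email": request.json["email"]}
    inserted = db.users.insert_one(doc)
    return jsonify({"id": str(inserted.inserted_id)}), 201


@app.route("/api/users/<name>", methods=["GET"])
def get_user(name):
    user = db.users.find_one({"name": name}, {"_id": 0})
    return (jsonify(user), 200) if user else (jsonify({"error": "not found"}), 404)
```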
About us
404 DM is a fast-growing, data-driven creative solutions provider with a portfolio of brands that includes Myntra, Flipkart, and Wildcraft. Our team stands for all things digital, with an abundance of creativity. The focus lies in building: from crafting compelling narratives that capture a brand's essence to developing long-lasting relationships with each and every client, we believe this is the driving force behind our work. 404 DM offers a wide variety of services in the digital sphere, including design, technology, media planning, and above all, creative strategizing that helps brands stand out among their contemporaries.
We are building this really cool product that will change the way brands do their marketing and acquire customers. We're looking for equally driven and highly skilled people to join us. We'd love to have a conversation with you and see if you and our team are the right fit for each other.
Roles and Responsibilities
- Defines site objectives by analyzing user requirements and envisioning system features and functionality.
- Designs and develops user interfaces for Internet/intranet applications by setting expectations and feature priorities throughout the development life cycle; determining design methodologies and toolsets; completing programming using languages and software products; and designing and conducting tests.
- Recommends system solutions by comparing advantages and disadvantages of custom development and purchase alternatives.
- Integrates applications by designing database architecture and server scripting; studying and establishing connectivity with network systems, search engines, and information servers.
- Completes applications development by coordinating requirements, schedules, and activities; contributing to team meetings; troubleshooting development and production problems across multiple environments and operating platforms.
- Enhances organization reputation by accepting ownership for accomplishing new and different requests; exploring opportunities to add value to job accomplishments.
- Supports and develops web application developers by providing advice and coaching.
- SADs (Senior Application Developers) carry the duties of a supervisor as well as those of a lower-level application developer. These duties can include regular consultations with the team concerning software, creating new programs, and testing newly installed programs to verify functionality.
- The managerial side of the role can include writing reports on team progress and presenting those reports to upper executive committees. SADs also make executive-level standards decisions (e.g., coding standards) for their department, ensure company policies are being followed, and counsel employees as needed.
Desired Candidate Profile
- Bachelor's or Master's degree in information technology or computer science. Combining a full-time academic degree with certifications in computer languages and software programs would be an added advantage.
- At least 5 to 8 years of experience as a leader, with hands-on experience in technologies such as JavaScript, HTML, C++, PHP, Angular, AJAX, REST APIs, Node.js, jQuery, MySQL, API OAuth integration, Shopify, open-source app development, and social media developer tools.
You will work closely with the business to design and develop technology solutions around the requirements.
Responsibilities:
● Lead the development of the backend systems for various products.
● Build reliable, secure and performant backend systems.
● Collaborate with the business to define the vision and implement the system architecture, design, and code.
● Help shape the backend development.
Requirements:
● Should have 1-4 years of software development experience
● Strong computer science fundamentals
● Good intuition for REST API design
● Deep knowledge of the JavaScript ecosystem, with hands-on experience writing code on Node.js
● Having worked on frontend frameworks such as React and Vue would be a plus.
● Understanding of DevOps would be helpful.