
About Hiteshi Infotech Pvt Ltd.
Required Skills & Qualifications
- 5–7 years of hands-on Salesforce development experience.
- Strong expertise in Salesforce CPQ, Revenue Cloud, and Agentforce.
- Hands-on experience in Apex, LWC, SOQL, SOSL, Triggers, and API integrations.
- Solid understanding of Salesforce data model, workflows, validation rules, and approval processes.
- Experience with Agentforce implementation, including AI-driven insights, guided selling, and workflow automation.
- Salesforce certifications preferred: Platform Developer I/II, CPQ Specialist, Revenue Cloud Consultant, or Agentforce certification (if available).
Preferred Skills
- Exposure to Salesforce CPQ, Revenue Cloud, Agentforce and Experience Cloud.
- Experience with DevOps tools like Copado, Gearset, or Git for release management.
If interested, kindly share your resume at 82008 31681.
Role: Creative Associate
Exp: 1-4 Years
CTC: up to 6.50 LPA
Requirements -
- Candidates must have hands-on cinematography experience across both photo and video shoots.
- Proven experience as a Creative Associate in the wedding or events industry is preferred.
- Manage client relationships.
We are looking for a highly skilled Sr. Big Data Engineer with 3-5 years of experience in building large-scale data pipelines, real-time streaming solutions, and batch/stream processing systems. The ideal candidate should be proficient in Spark, Kafka, Python, and AWS Big Data services, with hands-on experience in implementing CDC (Change Data Capture) pipelines and integrating multiple data sources and sinks.
Responsibilities
- Design, develop, and optimize batch and streaming data pipelines using Apache Spark and Python.
- Build and maintain real-time data ingestion pipelines leveraging Kafka and AWS Kinesis.
- Implement CDC (Change Data Capture) pipelines using Kafka Connect, Debezium or similar frameworks.
- Integrate data from multiple sources and sinks (databases, APIs, message queues, file systems, cloud storage).
- Work with AWS Big Data ecosystem: Glue, EMR, Kinesis, Athena, S3, Lambda, Step Functions.
- Ensure pipeline scalability, reliability, and performance tuning of Spark jobs and EMR clusters.
- Develop data transformation and ETL workflows in AWS Glue and manage schema evolution.
- Collaborate with data scientists, analysts, and product teams to deliver reliable and high-quality data solutions.
- Implement monitoring, logging, and alerting for critical data pipelines.
- Follow best practices for data security, compliance, and cost optimization in cloud environments.
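CDC appears in both the responsibilities and the skills list. In production this would typically be built on Debezium or Kafka Connect, but the core idea — replaying insert/update/delete change events into a current-state table — can be sketched in plain Python (all event shapes and names here are illustrative, not any particular connector's format):

```python
# Minimal illustration of applying CDC (Change Data Capture) events.
# In a real pipeline these events would arrive from Debezium via Kafka;
# here they are plain dicts so the logic is easy to follow.

def apply_cdc_events(events):
    """Replay a stream of change events into a current-state table keyed by id."""
    table = {}
    for event in events:
        op, row = event["op"], event["row"]
        if op in ("insert", "update"):
            table[row["id"]] = row          # upsert the latest version of the row
        elif op == "delete":
            table.pop(row["id"], None)      # tolerate deletes for unseen keys
    return table

events = [
    {"op": "insert", "row": {"id": 1, "name": "alice"}},
    {"op": "insert", "row": {"id": 2, "name": "bob"}},
    {"op": "update", "row": {"id": 1, "name": "alice2"}},
    {"op": "delete", "row": {"id": 2}},
]
state = apply_cdc_events(events)
```

The same replay logic is what sinks such as a data lake table (Delta/Iceberg/Hudi) perform at scale when merging change streams.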
Required Skills & Experience
- Programming: Strong proficiency in Python (PySpark, data frameworks, automation).
- Big Data Processing: Hands-on experience with Apache Spark (batch & streaming).
- Messaging & Streaming: Proficient in Kafka (brokers, topics, partitions, consumer groups) and AWS Kinesis.
- CDC Pipelines: Experience with Debezium / Kafka Connect / custom CDC frameworks.
- AWS Services: AWS Glue, EMR, S3, Athena, Lambda, IAM, CloudWatch.
- ETL/ELT Workflows: Strong knowledge of data ingestion, transformation, partitioning, schema management.
- Databases: Experience with relational databases (MySQL, Postgres, Oracle) and NoSQL (MongoDB, DynamoDB, Cassandra).
- Data Formats: JSON, Parquet, Avro, ORC, Delta/Iceberg/Hudi.
- Version Control & CI/CD: Git, GitHub/GitLab, Jenkins, or CodePipeline.
- Monitoring/Logging: CloudWatch, Prometheus, ELK/Opensearch.
- Containers & Orchestration (nice-to-have): Docker, Kubernetes, Airflow/Step Functions for workflow orchestration.
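The Kafka item above asks for an understanding of topics, partitions, and consumer groups. The key property to internalize is that records with the same key always map to the same partition, which preserves per-key ordering. Kafka's default partitioner hashes the key bytes with murmur2; the sketch below uses a simple CRC32 hash purely to illustrate the idea:

```python
# Illustration of Kafka-style key-based partitioning: records with the
# same key always land on the same partition, preserving per-key ordering.
# (Kafka's default partitioner actually uses murmur2 on the key bytes.)
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    return zlib.crc32(key.encode()) % num_partitions

p1 = partition_for("order-42", 6)
p2 = partition_for("order-42", 6)
same = (p1 == p2)  # same key -> same partition, every time
```

Within a consumer group, each partition is assigned to exactly one consumer, so this keyed routing is also what bounds a group's parallelism.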
Preferred Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- Experience in large-scale data lake / lake house architectures.
- Knowledge of data warehousing concepts and query optimization.
- Familiarity with data governance, lineage, and cataloging tools (Glue Data Catalog, Apache Atlas).
- Exposure to ML/AI data pipelines is a plus.
Tools & Technologies (must-have exposure)
- Big Data & Processing: Apache Spark, PySpark, AWS EMR, AWS Glue
- Streaming & Messaging: Apache Kafka, Kafka Connect, Debezium, AWS Kinesis
- Cloud & Storage: AWS (S3, Athena, Lambda, IAM, CloudWatch)
- Programming & Scripting: Python, SQL, Bash
- Orchestration: Airflow / Step Functions
- Version Control & CI/CD: Git, Jenkins/CodePipeline
- Data Formats: Parquet, Avro, ORC, JSON, Delta, Iceberg, Hudi
About Us:
Bansal Classes in Chinchwad is a renowned educational institute specializing in IIT, JEE, and NEET coaching. We are seeking an experienced Branch Manager to join our team and contribute to our mission of nurturing future leaders.
Job Description:
· The Manager is responsible for meeting the branch's monthly sales targets and for customer service, security, and safety in accordance with branch objectives.
· Develops new business and provides a superior level of customer service.
· The successful candidate will have a hands-on approach and will be committed to the expansion and success of the branch, implementing strategies that increase productivity and enable the business to achieve its goals.
· 85% Sales and 15% Operations
· Achieve sales targets on a daily and monthly basis.
· Handle and lead the branch team.
· Record performance and report to the reporting head.
Requirements:
- Minimum of 3 years of experience as a Branch Manager, preferably in the education sector.
- Excellent verbal and written communication skills.
- Empathy and the ability to connect with students and parents.
- Understanding of the IIT, JEE, and NEET coaching industry is a plus.
- Strong organizational skills and attention to detail.
- A passion for education and a commitment to the success of our students.
- The ability to handle branch responsibilities effectively.
Django Developer (Backend Developer) - Hyderabad Location
Prelude
We are BeyondScale, on a mission to build a mobile learning app to help organizations create internal courses for their workforce easily. eLearning is booming and we aim to tap into the under-served non-IT L&D market and make a difference in the livelihoods of millions of people.
Job Description:
- 2+ years of experience coding with Python.
- Design, build, and maintain efficient, reusable, and reliable code.
- Eager and proactive to learn new technical skills.
- Hands-on experience developing web APIs and writing database queries in PostgreSQL (MongoDB, MySQL, and DynamoDB are a plus).
- Good understanding of OOP, multiprocessing, and threading.
- Proficient in testing and debugging programs.
- Well-versed with Git and modern development workflow practices
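The posting above asks for hands-on experience writing database queries. One habit worth demonstrating in an interview is parameterized queries, which keep user input out of the SQL string. The sketch below uses the stdlib sqlite3 module as a stand-in for PostgreSQL (with psycopg2 the placeholder style would be %s rather than ?; the table and values are invented for the example):

```python
# Parameterized queries, shown with sqlite3 as a stand-in for PostgreSQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("INSERT INTO users (email) VALUES (?)", ("dev@example.com",))

# Placeholders bind values safely -- no string concatenation, no injection risk.
row = conn.execute(
    "SELECT id, email FROM users WHERE email = ?", ("dev@example.com",)
).fetchone()
```

The same pattern applies unchanged through Django's ORM or raw-cursor API, which parameterize queries for you.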
Senior Node.js Developer
Job Description
- Working experience with Node.js, MySQL, and PostgreSQL
- Good knowledge of designing and writing RESTful APIs
- Working experience with Git
- Candidates with knowledge of GraphQL and TypeScript preferred
- Well-versed and experienced with unit testing and code coverage
- Well-versed with SOLID design principles and their application
- Knowledge of Auth0 and TypeORM required
- Good to have: some experience with Azure DevOps and/or Azure Functions
- Good to have: working experience with Docker
We are looking for a PHP Developer responsible for managing back-end services and the interchange of data between the server and the users.
Responsibilities and duties:
- Develop, record, and maintain cutting-edge web-based PHP applications on portal-plus premium service platforms.
- Ensure HTML, CSS, and shared JavaScript is valid and consistent across applications.
- Utilize backend data service.
Skills and Knowledge:
- Strong knowledge of PHP
- Knowledge of HTML, CSS, and JavaScript, etc
- Writing well designed, testable, efficient code by using best software development practices
- Knowledge of WordPress, Laravel, and CodeIgniter (preferred)
- Knowledge of any one framework of PHP (Required)
Programming Languages needed:
- HTML
- CSS
- JavaScript
- PHP (Required)
- Laravel/WordPress/CodeIgniter (preferred)
Who Can Apply:
- Candidates with a minimum of 6 months of experience
- Candidates with a maximum of 1 year of experience
- Candidates based in Surat
- Candidates willing to relocate to Surat
- Candidates with the relevant skills and knowledge
Freshers Are Also Welcome !!!
***** SURAT CANDIDATES ARE MORE PREFERRED *****
This includes working on:
a) The main Django application, a large, modern, Django app built using Python 3.8 and the latest Python and Django libraries;
b) The API, built using Django Rest Framework (DRF) that is used both by our web-app and client libraries to build and run data analyses;
c) Backend code that integrates our web server with the rest of our cloud architecture, including our PaaS, data science code, general integrations such as payments, devops code, and more.
Ideally, you should have experience working on Django codebases that serve both server-side rendered pages and APIs via DRF. Frontend/full-stack knowledge is an advantage but not essential. Familiarity with modern development practices, such as CI/CD, testing, DevOps, Docker, Linux, and Git, would be a big plus. You must have very strong familiarity with Python development and be excited to pick up new technologies and skills - for instance, we use Python type hints extensively across our codebase.
You should like the idea of releasing to real customers regularly, and prioritise getting a great product into users’ hands for feedback and iteration. You will have extensive scope to build and architect the backend, and to help grow the team in the future.
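The paragraph above notes that type hints are used extensively across the codebase. A minimal illustration of what that looks like in practice (the function, data, and names are invented for the example; typing.Dict/Optional are used so the snippet also runs on Python 3.8, the version the posting mentions):

```python
# Small illustration of type-hinted Python of the kind the posting describes.
from typing import Dict, Optional

def find_user_email(users: Dict[int, str], user_id: int) -> Optional[str]:
    """Return the user's email, or None if the id is unknown."""
    return users.get(user_id)

emails = {1: "a@example.com", 2: "b@example.com"}
hit = find_user_email(emails, 1)
miss = find_user_email(emails, 99)
```

Annotations like these are checked statically (e.g. with mypy) rather than at runtime, so they document intent without changing behavior.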