
- Build and maintain relationships with clients
- Generate leads
- Assess the strengths and weaknesses of the sales team and manage the sales program accordingly
- Monitor the program on 500+ accounts across all domestic sales channels and pinpoint ways to penetrate new markets
- Provide on-the-ground support for sales associates as they generate leads and close new deals
- Meet with customers to discuss their evolving needs and to assess the quality of our company’s relationship with them
- Develop and implement new sales initiatives, strategies and programs to capture key demographics
- Provide daily reports of field sales successes and communicate voice-of-customer (VOC) data to superiors

About Roof and Assets Infra Pvt Ltd
Company Name - Bhanzu (formerly Exploring Infinities)
Website - https://www.bhanzu.com/
Mode - WFO (work from office)
Job Description:
Roles and Responsibilities:
● Familiarize yourself with the company's vision and mission, working to accomplish its goals and objectives
● Collate and maintain client information in the CRM database
● Make calls to clients and respond to callback requests
● Hold email and WhatsApp conversations with potential leads
● Help potential leads understand our courses
● Convert potential leads into customers
● Learn and use our customer relationship management (CRM) software and other related computer software
Requirements:
● Excellent written and verbal communication skills
● Ability to work under pressure
● Laptop is mandatory
● Willing to work in a startup environment (fast-paced)
● Working days: 6 per week (rotational day off Monday-Friday; Saturday and Sunday are compulsory working days)
About the Role:
We are looking for a skilled Full Stack Developer (Python & React) to join our Data & Analytics team. You will design, develop, and maintain scalable web applications while collaborating with cross-functional teams to enhance our data products.
Responsibilities:
- Develop and maintain web applications (front-end & back-end).
- Write clean, efficient code in Python and TypeScript (React).
- Design and implement RESTful APIs.
- Work with Snowflake, NoSQL, and streaming data platforms.
- Build reusable components and collaborate with designers & developers.
- Participate in code reviews and improve development processes.
- Debug and resolve software defects while staying updated with industry trends.
Qualifications:
- Passion for immersive user experiences and data visualization tools (e.g., Apache Superset).
- Proven experience as a Full Stack Developer.
- Proficiency in Python (Django, Flask) and JavaScript/TypeScript (React).
- Strong understanding of HTML, CSS, SQL/NoSQL, and Git.
- Knowledge of software development best practices and problem-solving skills.
- Experience with AWS, Docker, Kubernetes, and FaaS.
- Knowledge of Terraform and testing frameworks (Playwright, Jest, pytest).
- Familiarity with Agile methodologies and open-source contributions.
- Social Media Rockstar: Dream up engaging posts, stories, and reels for Facebook, Instagram, and beyond. Think memes, eye-catching visuals, and content that gets the cycling community buzzing.
- Design Guru: Become a master of visual storytelling! Design graphics and photos that make our website and social media pop.
- Content Chameleon: You'll be crafting catchy blog posts, articles, and other written content that showcases the cycling lifestyle and, naturally, the awesomeness of ONN Bikes.
- Trend Spotter: Keep your finger on the pulse of the latest social media trends and content marketing magic.
- Team Player: Work hand-in-hand with our marketing and sales crew to ensure all content aligns with our brand and crushes our marketing goals.
About Us:
We’re looking to hire someone to help scale Machine Learning and NLP efforts at Episource. You’ll work with the team that develops the models powering Episource’s product focused on NLP-driven medical coding. Some of the problems include improving our ICD code recommendations, clinical named entity recognition, and information extraction from clinical notes.
This is a role for highly technical engineers who combine outstanding oral and written communication skills with the ability to build prototypes and productionize them using a large range of tools, frameworks, and languages. Most importantly, they need the ability to autonomously plan and organize their work assignments based on high-level team goals.
What you will do at Episource:
You will be responsible for setting an agenda to develop and build machine learning platforms that positively impact the business, working with partners across the company including operations and engineering. You will be working closely with the machine learning team to design and implement back end components and services. You will be evaluating new technologies, enhancing the applications, and providing continuous improvements to produce high quality software.
Required Skills:
- Strong background in analytics, BI, or data science deployments is preferable, with 2-6 years of experience
- Knowledge of React/Vue, HTML, CSS
- Experience building and consuming APIs
- Experience with MySQL, MongoDB, and the MEAN stack
- Knowledge of and experience with serverless architectures is a plus
- Hands-on experience with AWS or any major cloud service provider for deploying solutions
- Experience with Docker or Kubernetes for deploying solutions on the cloud
- Hands-on experience with Python, Apache Spark, and Big Data platforms to manipulate large-scale structured and unstructured datasets
- Fluent in data fundamentals: SQL, data manipulation using a procedural language, statistics, experimentation, and modeling
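The "data fundamentals" bullet above can be illustrated with a tiny sketch: the same group-by aggregation done once in SQL (via the standard library's sqlite3 module) and once procedurally in Python. The table and column names are invented for the example:

```python
# The same aggregation expressed two ways: declaratively in SQL
# (stdlib sqlite3, in-memory) and procedurally with an explicit loop.
# Table/column names and the sample rows are illustrative only.
import sqlite3
from collections import defaultdict

rows = [("icd_e11", 3), ("icd_i10", 5), ("icd_e11", 2)]

# SQL version: load into an in-memory SQLite table and GROUP BY.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE codes (code TEXT, mentions INTEGER)")
con.executemany("INSERT INTO codes VALUES (?, ?)", rows)
sql_totals = dict(
    con.execute("SELECT code, SUM(mentions) FROM codes GROUP BY code")
)

# Procedural version: the same group-by as an explicit loop.
proc_totals = defaultdict(int)
for code, mentions in rows:
    proc_totals[code] += mentions

assert sql_totals == dict(proc_totals)  # {'icd_e11': 5, 'icd_i10': 5}
```

Being able to move between the two formulations, and to reason about when each is appropriate, is essentially what "fluent in data fundamentals" asks for.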
AWS Glue Developer
Work Experience: 6 to 8 Years
Work Location: Noida, Bangalore, Chennai & Hyderabad
Must Have Skills: AWS Glue, DMS, SQL, Python, PySpark, Data integrations and Data Ops
Job Reference ID:BT/F21/IND
Job Description:
Design, build and configure applications to meet business process and application requirements.
Responsibilities:
➢ 7 years of work experience with ETL, data modelling, and data architecture.
➢ Proficient in ETL optimization, designing, coding, and tuning big data processes using PySpark.
➢ Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions.
➢ Orchestration using Airflow.
Technical Experience:
➢ Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines.
➢ Experience building data pipelines and applications to stream and process large datasets at low latency.
➢ Enhancements, new development, defect resolution and production support of Big data ETL development using AWS native services.
➢ Create data pipeline architecture by designing and implementing data ingestion solutions.
➢ Integrate data sets using AWS services such as Glue, Lambda functions/ Airflow.
➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Athena.
➢ Author ETL processes using Python and PySpark.
➢ Build Redshift Spectrum direct transformations and data modelling using data in S3.
➢ ETL process monitoring using CloudWatch events.
➢ You will be working in collaboration with other teams; good communication is a must.
➢ Must have experience using AWS service APIs, the AWS CLI, and SDKs.
Professional Attributes:
➢ Experience operating very large data warehouses or data lakes.
➢ Expert-level skills in writing and optimizing SQL.
➢ Extensive real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures, with a focus on cloud technology.
➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, Dynamo DB, Athena, Glue in AWS environment.
➢ Expertise in S3, RDS, Redshift, Kinesis, and EC2 clusters is highly desired.
Qualification:
➢ Degree in Computer Science, Computer Engineering or equivalent.
Salary: Commensurate with experience and demonstrated competence
Strong hands-on experience with ReactJS, Redux, TypeScript, HTML, Bootstrap, Material UI, CSS3/SCSS, Webpack, NPM, NVM, Visual Studio Code
Knowledge of/experience with RESTful API integration
Unit testing with Jasmine or an equivalent framework
Experience or knowledge in developing complex reusable component libraries, such as complex grids, spreadsheet components, etc.
Strong in UI/UX standards implementation
Understanding of browser fundamentals and Page Optimizations
Experience working with Git and Azure
Hands on experience in:
- Deploying, managing, securing, and patching enterprise applications at large scale in the cloud, preferably AWS.
- Experience leading End-to-end DevOps projects with modern tools encompassing both Applications and Infrastructure
- AWS Code deploy, Code build, Jenkins, Sonarqube.
- Incident management and root cause analysis.
- Strong understanding of immutable infrastructure and infrastructure-as-code (IaC) concepts.
- Participating in capacity planning and provisioning of new resources; importing already-deployed infrastructure into IaC.
- Utilizing AWS cloud services such as EC2, S3, IAM, Route53, RDS, VPC, NAT/Internet Gateway, Lambda, Load Balancers, CloudWatch, and API Gateway.
- Managing multi-cluster container environments on AWS ECS (ECS on EC2 and Fargate, with service discovery using Route53).
- Monitoring/analytics tools like Nagios/DataDog and logging tools like LogStash/SumoLogic
- Simple Notification Service (SNS)
- Version Control System: Git, Gitlab, Bitbucket
- Participate in Security Audit of Cloud Infrastructure.
- Exceptional documentation and communication skills.
- Ready to work in shifts
- Knowledge of Akamai is a plus.
- Microsoft Azure is a plus.
- Adobe AEM is a plus.
- AWS Certified DevOps Professional certification is a plus.
We are looking to hire an experienced Frontend Developer with a very good grasp of JavaScript and the ability to build responsive pages.
The candidate should have good knowledge of:
- JavaScript
- HTML & CSS
- Node.js
- PWA
- Electron JS platform
- API Integration
Candidates having good logical skills are preferred.










