Description
- Dynamic professional to lead and drive the telecalling team, setting revenue targets in line with organisational goals.
- Motivating and mentoring the team to achieve and exceed individual and team targets.
- Ability to forge alliances quickly.
- Provide the team with a vision and objectives, and manage key performance indicators.
- Set monthly, weekly, and daily targets for the team and ensure they are achieved.
Requirements
- MBA/PGDM or B.Tech/BE from recognized institutes. B.Com or equivalent graduates from reputed colleges can also apply.
- 3–6 years of business-to-consumer (B2C) tele-sales experience is preferable.
- Awareness of the subjects taught from classes 5 to 12 across various boards/curricula globally.
- Sound knowledge and understanding of the challenges students face in schools/coaching classes.
Benefits
- Weekly incentives: Opportunity to earn incentives on a weekly basis.
- Perks and bonuses: Contests, parties, trips and much more!
- Medical Insurance: We provide medical insurance to all our employees.
- Custodian of a bright future: Create the right educational path for students. Help them overcome their obstacles to learning by using Toppr.
- As the business grows, you grow: We want Toppr to be built from within. We look at you as someone with the potential to become a future sales leader.
- Learn from the best: Learn from leaders whose teams have delivered over 25x growth over the last two years.

- Should have excellent knowledge of Swift and Objective-C
- Good working knowledge in Cocoa Touch
- Experience with performance and memory tuning with tools
- Experience with memory management & caching mechanisms specific to mobile devices
- Experience with third-party libraries and APIs
- Experience working with Core Data, Realm
- Understanding of the full mobile development life cycle
- Experience in publishing apps to the App Store.
- Version control tools: Git, Bitbucket
- Design patterns: MVC, MVVM, MVP
You will be responsible for designing, building, and maintaining data pipelines that handle real-world data (RWD) at Compile. You will be handling both inbound and outbound data deliveries for datasets including claims, remittances, EHR, SDOH, etc.
You will
- Work on building and maintaining data pipelines (specifically RWD).
- Build, enhance, and maintain existing pipelines in PySpark and Python, and help build analytical insights and datasets.
- Schedule and maintain pipeline jobs for RWD.
- Develop, test, and implement data solutions based on the design.
- Design and implement quality checks on existing and new data pipelines.
- Ensure adherence to security and compliance that is required for the products.
- Maintain relationships with various data vendors and track changes and issues across vendors and deliveries.
You have
- Hands-on experience with ETL processes (minimum of 5 years).
- Excellent communication skills and ability to work with multiple vendors.
- High proficiency with Spark and SQL.
- Proficiency in data modeling, validation, quality checks, and data engineering concepts.
- Experience with big-data processing technologies such as Databricks, dbt, S3, Delta Lake, Deequ, Griffin, Snowflake, and BigQuery.
- Familiarity with version control technologies, and CI/CD systems.
- Understanding of scheduling tools such as Airflow or Prefect.
- Minimum of 3 years of experience managing data warehouses.
- Familiarity with healthcare datasets is a plus.
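In practice, a quality check on a pipeline batch might look like the following minimal pure-Python sketch — a much-simplified stand-in for Spark/Deequ-style checks; the field names and rules are illustrative assumptions, not Compile's actual schema:

```python
# Minimal rule-based data quality checks on claims-like records.
# Field names ("claim_id", "amount") and thresholds are hypothetical.

def check_completeness(records, field):
    """Fraction of records where `field` is present and non-null."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

def run_quality_checks(records):
    """Return a dict of check name -> pass/fail for a batch of records."""
    return {
        "claim_id_complete": check_completeness(records, "claim_id") == 1.0,
        "amount_non_negative": all(r.get("amount", 0) >= 0 for r in records),
    }

batch = [
    {"claim_id": "C1", "amount": 120.0},
    {"claim_id": "C2", "amount": 0.0},
]
report = run_quality_checks(batch)
```

A real pipeline would run such checks as a gating step before delivery and route failing batches to investigation rather than silently dropping them.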
Compile takes diversity and equal opportunity seriously. We are committed to building a team of people from many backgrounds, perspectives, and skills. We know the more inclusive we are, the better our work will be.


About the company:
Gigmo Solutions is a fast-growing tech startup with a mission to fundamentally disrupt the customer support industry through a perfectly tuned symphony of Artificial Intelligence-based conversational bots and gig workers.
With our engineers spread across 10+ countries on 4 continents, Gigmo is uniquely poised to fundamentally change the technical customer support industry.
Role: Python Developer
Experience: 3+ years
Work location: Gurugram (Haryana)
Responsibilities
- Writing reusable, testable, and efficient code
- Design and implementation of low-latency, high-availability, and performant applications
- Integration of user-facing elements developed by front-end developers with server-side logic
- Developing RESTful APIs
- Implementation of security and data protection
- Integration of data storage solutions (PostgreSQL, MySQL)
- Design, develop, and maintain web scraping scripts using Python.
- Use web scraping libraries such as Beautiful Soup, Scrapy, and Selenium, along with other scraping tools, to extract data from websites.
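The extract step of a scraping script can be sketched with only the standard library, as below; in a real project Beautiful Soup or Scrapy would replace this hand-rolled parser. The HTML snippet and link targets are made-up examples:

```python
# Sketch of HTML data extraction using the stdlib html.parser module.
# Production code would typically use Beautiful Soup or Scrapy instead.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, anchor text) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._current_href = None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")

    def handle_data(self, data):
        # Record the text of the most recently opened <a> tag.
        if self._current_href is not None and data.strip():
            self.links.append((self._current_href, data.strip()))
            self._current_href = None

page = ('<ul><li><a href="/jobs/1">Python Developer</a></li>'
        '<li><a href="/jobs/2">Data Engineer</a></li></ul>')
parser = LinkExtractor()
parser.feed(page)
```

For dynamic, JavaScript-rendered pages this approach is insufficient, which is where Selenium comes in.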
Skills And Qualifications
- Expert in Python, with knowledge of Python web frameworks (Django, Flask, FastAPI)
- Familiarity with ORM libraries such as SQLAlchemy or Tortoise ORM
- Able to integrate multiple data sources and databases into one system.
- Understanding of the threading limitations of Python, and multi-process architecture
- Good understanding of REST APIs.
- Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
- Understanding of accessibility and security compliance
- Knowledge of user authentication and authorization between multiple systems, servers, and environments
- Understanding of fundamental design principles behind a scalable application
- Familiarity with event-driven programming in Python
- Understanding of the differences between multiple delivery platforms, such as mobile vs desktop, and optimizing output to match the specific platform.
- Able to create database schemas that represent and support business processes
- Strong unit test and debugging skills.
- Proficient understanding of code versioning tools (Git)
- Familiarity with relational databases (PostgreSQL, MySQL)
- Must have worked in the field for at least 3 years.
- Requires a bachelor's degree in Computer Science, Software Engineering, or a related field from a reputed institute (Tier-1 or Tier-2 colleges)
- Experience with Python development and web scraping techniques.
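The requirement above to "create database schemas that represent and support business processes" can be sketched with SQLite from the standard library; the tables, columns, and order workflow here are illustrative assumptions, not an actual production model:

```python
# Hypothetical schema for a simple order workflow, sketched in SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        status      TEXT NOT NULL DEFAULT 'pending',  -- pending -> shipped -> closed
        total       REAL NOT NULL CHECK (total >= 0)
    );
""")
conn.execute("INSERT INTO customers (id, name) VALUES (1, 'Acme')")
conn.execute("INSERT INTO orders (customer_id, total) VALUES (?, ?)", (1, 99.5))
row = conn.execute(
    "SELECT c.name, o.status, o.total FROM orders o "
    "JOIN customers c ON c.id = o.customer_id"
).fetchone()
```

The same structure would be expressed as SQLAlchemy or Tortoise ORM models in the frameworks the posting lists.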
CTC: As per Industry Standards.


- Java, JDBC, J2EE, AJAX, jQuery, Spring, Hibernate, RESTful web services, JavaScript, HTML, CSS
- T-SQL and PL/SQL
- Strong experience in core Java
- Experience in OOAD frameworks, Spring Integration, Hibernate, JSP, Tiles, Applets, Servlets
- Extensive development experience with PL/PGSQL (PostgreSQL)
- Good experience in code optimization, performance tuning, debugging
- Web/app server: Tomcat/JBoss/WAS
- Knowledge of OOP concepts, SDLC, and design patterns
- Good analytical ability and problem-solving skills
- Good written and verbal communication skills
- Java Certification will be an added advantage

- 5+ years of experience in a Data Engineering role on cloud environment
- Must have good experience in Scala/PySpark (preferably in a Databricks environment)
- Extensive experience with Transact-SQL.
- Experience in Databricks/Spark.
- Strong experience in data warehouse projects
- Expertise in database development projects with ETL processes.
- Manage and maintain data engineering pipelines
- Develop batch processing, streaming and integration solutions
- Experienced in building and operationalizing large-scale enterprise data solutions and applications
- Using one or more of Azure data and analytics services in combination with custom solutions
- Azure Data Lake, Azure SQL DW (Synapse), and SQL Database products or equivalent products from other cloud services providers
- In-depth understanding of data management (e.g., permissions, security, and monitoring).
- Cloud repositories, e.g., Azure Repos, GitHub, Git
- Experience in an agile environment (Prefer Azure DevOps).
Good to have
- Manage source data access security
- Automate Azure Data Factory pipelines
- Continuous Integration/Continuous deployment (CICD) pipelines, Source Repositories
- Experience in implementing and maintaining CICD pipelines
- Power BI understanding, Delta Lakehouse architecture
- Knowledge of software development best practices.
- Excellent analytical and organization skills.
- Effective working in a team as well as working independently.
- Strong written and verbal communication skills.
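The batch-processing and ETL responsibilities above can be sketched, at a much smaller scale, as an extract-transform-load step in plain Python; the column names, data, and filtering rule are illustrative assumptions:

```python
# Toy batch ETL step: extract rows from CSV text, transform (filter + cast),
# and load into an in-memory list standing in for a warehouse table.
import csv
import io

RAW = """order_id,amount,region
1,250.0,EU
2,-10.0,US
3,99.9,US
"""

def extract(text):
    """Extract: parse the raw CSV batch into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop invalid amounts and cast string fields to types.
    Real pipelines would do this in Spark/ADF with quality checks and
    dead-letter handling for rejected rows."""
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "region": r["region"]}
        for r in rows
        if float(r["amount"]) >= 0
    ]

warehouse = []  # Load: stand-in for the target warehouse table
warehouse.extend(transform(extract(RAW)))
```

At enterprise scale the same three stages map onto ADF pipelines or Databricks jobs, with Synapse or another warehouse as the load target.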

- Establish and maintain a trusted advisor relationship within the company’s IT, Commercial Digital Solutions, Functions, and Businesses you interact with
- Establish and maintain close working relationships with teams responsible for delivering solutions to the company’s businesses and functions
- Perform key management and thought leadership in the areas of advanced data techniques, including data modeling, data access, data integration, data visualization, big data solutions, text mining, data discovery, statistical methods, and database design
- Work with business partners to define ways to leverage data to develop platforms and solutions to drive business growth
- Engage collaboratively with project teams to support project objectives through the application of sound data architectural principles; support the project with knowledge of existing data assets and provide guidance on reusable data structures
- Share knowledge of external and internal data capabilities and trends, provide leadership, and facilitate the evaluation of vendors and products
- Utilize advanced data analysis, including statistical analysis and data mining techniques
- Collaborate with others to set an enterprise data vision with solid recommendations, and work to gain business and IT consensus
Basic Qualifications
- Overall 15+ years of IT Environment Experience.
- Solid background as a data architect/cloud architect with a minimum of 5 years as a core architect
- Architecting experience with cloud data warehouse solutions (Snowflake preferred; BigQuery, Redshift, or Synapse Analytics nice to have)
- Strong architecture and design skills using Azure services (e.g., ADF, Data Flows, Event Grid, IoT Hub, Event Hubs, ADLS Gen2, serverless Azure Functions, Logic Apps, Azure Analysis Services cube design patterns, Azure SQL DB)
- Working knowledge of Lambda/Kappa frameworks within data lake designs and architecture solutions
- Deep understanding of the DevOps/DataOps patterns
- Architecting the semantic models
- Data modeling experience with Data vault principles
- Cloud-native batch and real-time ELT/ETL patterns
- Familiarity with Lucid Chart/Visio
- Logging & monitoring pattern designs in the data lake /data warehouse context
- Metadata-driven design patterns and solutioning expertise
- Data Catalog integration experience in the data lake designs
- 3+ years of experience partnering with business managers to develop technical strategies and architectures to support their objectives
- 2+ years of hands-on experience with analytics deployment in the cloud (prefer Azure but AWS knowledge is acceptable)
- 5+ years of delivering analytics in modern data architecture (Hadoop, Massively Parallel Processing Database Platforms, and Semantic Modeling)
- Demonstrable knowledge of ETL and ELT patterns and when to use either one; experience selecting among different tools that could be leveraged to accomplish this (Talend, Informatica, Azure Data Factory, SSIS, SAP Data Services)
- Demonstrable knowledge of and experience with different scripting languages (Python, JavaScript, Pig) or object-oriented languages such as Java or .NET
Preferred Qualifications
- Bachelor’s degree in Computer Science, MIS, related field, or equivalent experience
- Experience working with solutions delivery teams using Agile/Scrum or similar methodologies
- 2+ years of experience designing solutions leveraging the Microsoft Cortana Intelligence Suite of products (Azure SQL, Azure SQL DW, Cosmos DB, HDInsight, Databricks)
- Experience with enterprise systems, like CRM, ERP, Field Services Management, Supply Chain solutions, HR systems
- Ability to work independently, establishing strategic objectives, project plans, and milestones
- Exceptional written, verbal & presentation skills
We (the LearnApp engineers) are a bunch of nerds who directly impact the daily lives of thousands of learners across the globe. As software engineers, we are the architects, designers, coders, and maintainers of our projects. In other words, we are the owners of the product lifecycle. We don't just write code; we work closely with the product team to turn vision into reality.
Responsibilities
- Analyze functional requirements and create software designs
- Develop responsive and sleek web applications
- Design, develop, test, deploy, maintain, and improve software
Qualifications
- Bachelor's degree in Computer Science, a related technical field of study, or equivalent practical experience
- Experience in Object-Oriented Design and programming concepts
- A minimum of 3 years of working experience in a product-oriented company
- Programming experience in TypeScript or JavaScript
- Experience in building responsive and fast user interfaces using modern web technologies
Preferred Qualification
- Experience in building SPA and PWA
- Experience building modern, good-looking UIs using the latest CSS and SCSS
- Knowledge of Agile software development methodologies
- Ability to learn other coding languages as needed
- Object-oriented, database design
- Hands-on experience with Power BI and Power tools
- Build Analysis Services reporting models.
- Develop visual reports, dashboards, and KPI scorecards using Power BI Desktop.
- Connect to data sources, importing and transforming data for business intelligence.
- Analytical thinking for translating data into informative visuals and reports.
- Implement row-level security on data, with an understanding of application security layer models in Power BI.
- Write DAX queries in Power BI Desktop.


