11+ ADS Jobs in Delhi, NCR and Gurgaon | ADS Job openings in Delhi, NCR and Gurgaon
Apply to 11+ ADS Jobs in Delhi, NCR and Gurgaon on CutShort.io. Explore the latest ADS Job opportunities across top companies like Google, Amazon & Adobe.
A bit about us:
Tchyon's mission is to build the easiest way to Discover and Signup for Financial Services.
We're empowering the next generation of individuals and businesses to access any service
which is permitted by their sovereign identity, INSTANTLY! Because Time is Money ⏳.
What you will get:
- To be a part of the Core-Team 💪
- Creating High Impact by Solving a Problem at Large (No one in the World has a similar product) 💥
- High Growth Work Environment ⚙
What we are looking for:
- An 'Exceptional Executor' -> Leader -> Creates Impact & Value
- Ability to take Ownership of your work
- Track record of launching growth initiatives
Role:
- Designing and executing all aspects of digital marketing campaigns, including Push, SMS, email, PR, on-ground GTM, advertising campaigns and Social
- Creating, curating and managing all the digital content including Blogs, Ads and Social Media creatives
- Optimizing content using SEO best practices.
- Ideating and coming up with clutter-breaking video content concepts that can drive engagement and commerce on the platform
- Undertaking consumer research to identify needs and current friction points, and proposing ideas backed by analysis and consumer understanding to unlock growth
- Deriving meaningful insights from the components and drivers of growth, along with trends (such as traffic, conversion, user engagement and repeat behavior), and creating actionable programs that lead to user growth
Qualifications:
- 1+ years of marketing/content experience
- Track record of launching growth initiatives
- Experience/curiosity in the Fintech space
Job Description:
As an Azure Data Engineer, your role will involve designing, developing, and maintaining data solutions on the Azure platform. You will be responsible for building and optimizing data pipelines, ensuring data quality and reliability, and implementing data processing and transformation logic. Your expertise in Azure Databricks, Python, SQL, Azure Data Factory (ADF), PySpark, and Scala will be essential for performing the following key responsibilities:
Designing and developing data pipelines: You will design and implement scalable and efficient data pipelines using Azure Databricks, PySpark, and Scala. This includes data ingestion, data transformation, and data loading processes.
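As a rough illustration of the three stages named above (ingestion, transformation, loading), here is a minimal sketch in plain Python. The field names and the in-memory "sink" are hypothetical stand-ins; a real pipeline for this role would operate on Azure Databricks with PySpark or Scala DataFrames rather than Python lists.

```python
# Minimal sketch of the ingest -> transform -> load stages of a data
# pipeline, in plain Python for illustration only. In production these
# stages would run as PySpark jobs on Azure Databricks.
import csv
import io


def ingest(raw_csv: str) -> list[dict]:
    """Ingestion: parse raw source data into records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(records: list[dict]) -> list[dict]:
    """Transformation: clean fields and cast types for downstream use."""
    return [
        {"user_id": r["user_id"].strip(), "amount": float(r["amount"])}
        for r in records
    ]


def load(records: list[dict], sink: list) -> None:
    """Loading: write transformed records to a target store (a list here)."""
    sink.extend(records)


raw = "user_id,amount\n u1 ,10.5\nu2,3.0\n"
warehouse: list[dict] = []
load(transform(ingest(raw)), warehouse)
print(warehouse)
```

The point of the sketch is the separation of stages: each step can then be tested, monitored, and scaled independently, which is what the Databricks/ADF tooling described below provides at production scale.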
Data modeling and database design: You will design and implement data models to support efficient data storage, retrieval, and analysis. This may involve working with relational databases, data lakes, or other storage solutions on the Azure platform.
Data integration and orchestration: You will leverage Azure Data Factory (ADF) to orchestrate data integration workflows and manage data movement across various data sources and targets. This includes scheduling and monitoring data pipelines.
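In ADF, such a workflow is typically defined as a pipeline of activities. A minimal sketch of a pipeline definition with a single Copy activity moving data from Blob Storage to Azure SQL might look like the following; the pipeline and dataset names are hypothetical placeholders, not values from this role.

```json
{
  "name": "CopySalesDataPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlobToSql",
        "type": "Copy",
        "inputs": [
          { "referenceName": "BlobSalesDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SqlSalesDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

Scheduling is then attached separately via ADF triggers, and pipeline runs are monitored through the ADF monitoring views mentioned above.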
Data quality and governance: You will implement data quality checks, validation rules, and data governance processes to ensure data accuracy, consistency, and compliance with relevant regulations and standards.
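A data-quality check in this sense is just a rule applied to each record before it is allowed downstream. Here is a minimal sketch in plain Python; the specific rules and field names are illustrative assumptions, and in a Databricks pipeline the same logic would typically be expressed as DataFrame filters.

```python
# Sketch of simple data-quality validation rules applied to pipeline
# records. Rule set and field names are illustrative, not prescriptive.

def validate(record: dict) -> list[str]:
    """Return a list of rule violations for one record (empty = clean)."""
    errors = []
    if not record.get("user_id"):
        errors.append("user_id missing")               # completeness check
    if record.get("amount", 0) < 0:
        errors.append("amount must be non-negative")   # validity check
    return errors


records = [
    {"user_id": "u1", "amount": 10.5},
    {"user_id": "", "amount": -2.0},
]
clean = [r for r in records if not validate(r)]
rejected = [r for r in records if validate(r)]
print(len(clean), len(rejected))
```

Keeping the rejected records (rather than silently dropping them) is what makes auditing and governance reporting possible later.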
Performance optimization: You will optimize data pipelines and queries to improve overall system performance and reduce processing time. This may involve tuning SQL queries, optimizing data transformation logic, and leveraging caching techniques.
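One of the caching techniques alluded to above is memoisation: if the same expensive lookup recurs across a pipeline run, cache its result instead of repeating the work. The sketch below uses Python's standard `functools.lru_cache`; the "expensive" lookup is a hypothetical stand-in for a real database or dimension-table query.

```python
# Sketch of memoisation as a caching technique for reducing processing
# time: repeated lookups hit the cache instead of the backing store.
from functools import lru_cache

CALLS = 0  # counts how many "real" lookups actually run


@lru_cache(maxsize=1024)
def lookup_dimension(key: str) -> str:
    """Stand-in for an expensive database round trip."""
    global CALLS
    CALLS += 1
    return key.upper()


for k in ["a", "b", "a", "a", "b"]:
    lookup_dimension(k)

print(CALLS)  # only 2 real lookups for 5 calls
```

Spark offers the analogous `DataFrame.cache()` for reusing an intermediate result across multiple downstream computations.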
Monitoring and troubleshooting: You will monitor data pipelines, identify performance bottlenecks, and troubleshoot issues related to data ingestion, processing, and transformation. You will work closely with cross-functional teams to resolve data-related problems.
Documentation and collaboration: You will document data pipelines, data flows, and data transformation processes. You will collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and provide data engineering support.
Skills and Qualifications:
Strong experience with Azure Databricks, Python, SQL, ADF, PySpark, and Scala.
Proficiency in designing and developing data pipelines and ETL processes.
Solid understanding of data modeling concepts and database design principles.
Familiarity with data integration and orchestration using Azure Data Factory.
Knowledge of data quality management and data governance practices.
Experience with performance tuning and optimization of data pipelines.
Strong problem-solving and troubleshooting skills related to data engineering.
Excellent collaboration and communication skills to work effectively in cross-functional teams.
Understanding of cloud computing principles and experience with Azure services.
Create and maintain positive client relationships to build a business. Understand clients' needs and tailor products to meet client requirements.
Attend inbound/outbound sales calls for revenue generation
Achieve the periodic sales target
Perform up-selling and cross-selling of products to clients.
Build rapport with the client and create a potential pipeline for referral leads
Meet periodic targets for sales via referral leads
Conduct effective sales presentations for potential customers.
Maintain SLA protocols in day-to-day functioning; consistently meet quality parameters such as average talk time, working hours, evaluation score, etc.
Maintain high evaluation scores in the sales audits
Timely completion of daily tasks assigned
Call clients at their allotted appointment slots.
Product repair company (MNC)
A minimum of 6 years of software development experience
In-depth knowledge of C#
Hands-on experience with the ASP.NET MVC Framework
Strong knowledge and experience in designing and developing Web solutions using the ASP.NET MVC Framework
Rich experience in creating REST-based web services using ASP.NET Web API or ASP.NET Core
Working experience with relational databases (MS SQL; Oracle nice to have) and ORMs (Entity Framework, EF Core)
Affinity with unit and integration testing frameworks and TDD
Knowledge of Azure cloud and DevOps pipeline nice to have
Analyze, plan and estimate requirements, identify the risks involved, and provide solutions
Understanding of design principles behind scalable and testable applications
Continuous improvement and an innovative mindset.
Ability to take a lead role and take ownership of complex data integration deliveries.
A degree in Computer Science and/or a business-related field, or equivalent work experience.
What you will do:
- Deriving insights from data and using the same for product improvements and iterations
- Breaking down the overall product strategy and roadmap into specific components that you delegate or own yourself
- Evaluating market opportunities and using research to best position/reposition our products, identifying new opportunities and refining requirements
- Owning the product roadmap, leading growth initiatives, and driving them to successful outcomes, from building and launching to optimizing critical user flows
What you need to have:
- Minimum 1 to 5 years of B2C product management experience
- Experience designing and building metrics for product engagement and customer satisfaction; adept in user research, data-driven experimentation, sprints and rapid iteration
● Good working knowledge on Core PHP
● Strong knowledge of MVC frameworks, particularly Laravel.
● Good working knowledge of JavaScript & JS frameworks, preferably jQuery
● Strong knowledge of MySQL databases & OOP concepts.
● Strong knowledge of ORM and query builder.
● Knowledge of queue scheduling and task scheduling.
● Strong knowledge of Artisan commands.
● Hands-on experience with service providers and the ability to create new service providers in the application when needed.
● Integration of user-facing elements developed by front-end developers.
● Build efficient, testable, and reusable modules.
● Solve complex performance problems and architectural challenges.
● Write well-designed, testable, efficient code using software development best practices.
● Integrate data from various back-end services and databases.
● Create and maintain software documentation.
● Knowledge of REST APIs.
● Knowledge of the payment gateway integration (PayPal, PayUMoney, CC Avenue etc.)
● Good knowledge of wallet integration (Paytm, Phone-Pe, Amazon) etc.
● Proficient understanding of code versioning tools, such as Git, is mandatory
● Ability to debug the code and resolve the issues.
Key Responsibilities:
Build reusable code and libraries for future use
Participate in the full lifecycle of the projects.
Optimize application for maximum speed and scalability
Ability to work independently and within a team environment
Assist in troubleshooting any production issues.
Working within an existing code base and also writing code from scratch.
Good team player -collaborate with other team members and stakeholders.
Commitment to meet corporate goals and tight deadlines.
Key Skills: HTML, CSS, Bootstrap, jQuery, JavaScript, PHP, Core PHP, knowledge of MVC frameworks, databases (MySQL), Laravel Framework, REST APIs, 3rd-party tool integration, OOP programming concepts.