Forward creative specs to creative agencies and ensure creatives are built correctly
Responsible for executing tag generation and implementation (creative and conversion tags) as well as report generation as per the campaign schedule
Troubleshoot creative and operations-related issues for all campaigns under your management
Provide the highest standard of accuracy and quality of work, generating the best possible experience for internal and external customers.
Description
Come Join Us
Experience.com - We make every experience matter more
Position: Senior GCP Data Engineer
Job Location: Chennai (Base Location) / Remote
Employment Type: Full Time
Summary of Position
A Senior Data Engineer is a professional who specializes in preparing big data infrastructure for analytical or operational uses. He/She is responsible for developing and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity. They collaborate with data scientists and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organisation.
Responsibilities:
- Collaborate with cross-functional teams to define, prioritize, and execute data engineering initiatives aligned with business objectives.
- Design and implement scalable, reliable, and secure data solutions in line with industry best practices and compliance requirements.
- Drive the adoption of cloud-native technologies and architectural patterns to optimize the performance, cost, and reliability of data pipelines and analytics solutions.
- Mentor and lead a team of Data Engineers.
- Demonstrate a drive to learn and master new technologies and techniques.
- Apply strong problem-solving skills with an emphasis on building data-driven or AI-enhanced products.
- Coordinate with ML/AI and engineering teams to understand data requirements.
Experience & Skills:
- 8+ years of strong experience in ETL and ELT of data from various sources into data warehouses
- 8+ years of experience in Python, Pandas, NumPy, and SciPy.
- 5+ years of experience in GCP
- 5+ years of experience in BigQuery, PySpark, and Pub/Sub
- 5+ years of experience working with and creating data architectures.
- Certified in Google Cloud Professional Data Engineer.
- Advanced proficiency in Google Cloud services such as Dataflow, Dataproc, Dataprep, Data Studio, and Cloud Composer.
- Proficient in writing complex Spark (PySpark) User Defined Functions (UDFs), Spark SQL, and HiveQL (a minimal PySpark UDF sketch follows this list).
- Good understanding of Elasticsearch.
- Experience in assessing and ensuring data quality, data testing, and addressing data quality issues.
- Excellent understanding of Spark architecture and underlying frameworks including storage management.
- Solid background in database design and development, database administration, and software engineering across full life cycles.
- Experience with NoSQL data stores like MongoDB, DocumentDB, and DynamoDB.
- Knowledge of data governance principles and practices, including data lineage, metadata management, and access control mechanisms.
- Experience in implementing and optimizing data security controls, encryption, and compliance measures in GCP environments.
- Ability to troubleshoot complex issues, perform root cause analysis, and implement effective solutions in a timely manner.
- Proficiency in data visualization tools such as Tableau, Looker, or Data Studio to create insightful dashboards and reports for business users.
- Strong communication and interpersonal skills to effectively collaborate with technical and non-technical stakeholders, articulate complex concepts, and drive consensus.
- Experience with agile methodologies and project management tools like Jira or Asana for sprint planning, backlog grooming, and task tracking.
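As a quick illustration of the PySpark UDF and Spark SQL skills listed above, here is a minimal sketch (the app name, columns, and UDF are hypothetical examples, not taken from the role's actual codebase):

from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf_sketch").getOrCreate()

# Hypothetical input: a DataFrame with a raw phone-number column.
df = spark.createDataFrame([("91-98765-43210",), ("044 2345 6789",)], ["raw_phone"])

# A simple UDF that keeps only the digits of the raw value.
@udf(returnType=StringType())
def digits_only(value):
    return "".join(ch for ch in value if ch.isdigit()) if value else None

df = df.withColumn("phone_digits", digits_only("raw_phone"))

# The same UDF can be registered for use from Spark SQL / HiveQL-style queries.
spark.udf.register("digits_only_sql", digits_only)
df.createOrReplaceTempView("contacts")
spark.sql("SELECT raw_phone, digits_only_sql(raw_phone) AS phone_digits FROM contacts").show()

In practice, built-in Spark SQL functions are preferred over UDFs where possible, since UDFs bypass Catalyst optimizations.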
Requirements
- 3+ years of work experience with production-grade Python. Contributions to open-source repositories are preferred
- Experience writing concurrent and distributed programs; experience with AWS Lambda, Kubernetes, Docker, and Spark is preferred.
- Experience with one relational and one non-relational database is preferred
- Prior work in the ML domain will be a big plus
What You’ll Do
- Help realize the product vision: Production-ready machine learning models with monitoring within moments, not months.
- Help companies deploy their machine learning models at scale across a wide range of use-cases and sectors.
- Build integrations with other platforms to make it easy for our customers to use our product without changing their workflow.
- Write maintainable, scalable, and performant Python code
- Build gRPC and REST API servers (a minimal sketch follows this list)
- Work with Thrift, Protobufs, etc.
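Since the role involves building gRPC and REST API servers, here is a minimal, dependency-free sketch of a REST-style JSON endpoint using only the Python standard library (the /health route and port are illustrative assumptions; a real service would more likely use a framework such as FastAPI, or gRPC with generated protobuf stubs):

from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Hypothetical endpoint: GET /health returns a JSON status payload.
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    # Serve on port 8080 until interrupted.
    HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()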
Job Description:
- This is a BPO night-shift job (US voice process) in Nagercoil.
- This is purely a night shift, with fixed Saturday and Sunday off.
- This is not sales or telemarketing; the role is to help US citizens.
- It is work from office only, with a salary range of Rs 15,000 to 25,000 per month along with unlimited incentives based on the leads you generate (Rs 500 per lead).
Responsibilities:
- Handle outbound calls to international customers.
- You will be working on a US government project where you receive customer details and complete the further process.
- Maintain accurate and detailed records of customer interactions and transactions.
- Collaborate with team members to achieve individual and team goals.
- Strive to achieve customer satisfaction and ensure positive feedback.
Requirements:
- Both freshers and experienced candidates can apply.
- Excellent communication skills / fluency in English.
- Ensure timely and professional responses to all queries.
- Strong ability to multitask and make decisions quickly and independently.
- Night shift only (7:30 PM to 4:30 AM).
Benefits:
- Competitive salary + incentives.
- Post-shift drop facility for female employees only.
- ESI, PF, and insurance benefits
Sr. Java Software Engineer:
Preferred Education & Experience:
- Bachelor’s or master’s degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, or a related technical field; at least 3 years of relevant experience in lieu of the above if from a different stream of education.
- Well-versed in and 5+ years of hands-on designing experience in Object Oriented Design, Data Modeling, Class & Object Modeling, Microservices Architecture & Design.
- Well-versed in and 5+ years of hands-on programming experience in Core Java Programming, Advanced Java Programming, Spring Framework, Spring Boot or Micronaut Framework, Log Framework, Build & Deployment Framework, etc.
- 3+ years of hands-on experience developing Domain-Driven Microservices using libraries & frameworks such as Micronaut, Spring Boot, etc.
- 3+ years of hands-on experience developing connector frameworks such as Apache Camel, the Akka framework, etc.
- 3+ years of hands-on experience in RDBMS & NoSQL database concepts and development practices (PostgreSQL, MongoDB, Elasticsearch, Amazon S3).
- 3+ years of hands-on experience developing web services using REST and API Gateways with token-based authentication and access management.
- 1+ years of hands-on experience developing and hosting microservices using serverless and container-based development (AWS Lambda, Docker, Kubernetes, etc.).
- Knowledge of and hands-on experience developing applications using Behavior-Driven Development and Test-Driven Development methodologies is a plus.
- Knowledge of and hands-on experience in AWS Cloud services such as IAM, Lambda, EC2, ECS, ECR, API Gateway, S3, SQS, Kinesis, CloudWatch, DynamoDB, etc. is also a plus.
- Knowledge of and hands-on experience in DevOps CI/CD tools such as JIRA, Git (Bitbucket/GitHub), Artifactory, etc., and build tools such as Maven & Gradle.
- 2+ years of hands-on development experience in Java-centric Developer Tools, Management & Governance, Networking and Content Delivery, Security, Identity, and Compliance, etc.
- Knowledge of and hands-on experience in Apache NiFi, Apache Spark, and Apache Flink is also a plus.
- Knowledge of and hands-on experience in Python, NodeJS, and Scala programming is also a plus.
Required Experience: 5+ Years
Job Location: Remote / Pune
Open Positions: 1
Requirement:
1. Node.js: minimum 2 years of experience.
2. Databases: MongoDB, SQL, etc.; minimum 2 years of experience with these.
3. Caching: Redis, Memcached, etc.
4. Message queues: RabbitMQ, Kafka, etc.
Location: Delhi (Work from office).
Package: Up to 12 LPA
Responsibilities:
- Must be able to write quality code and build secure, highly available systems.
- Assemble large, complex datasets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc., with guidance.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Monitoring performance and advising on any necessary infrastructure changes.
- Defining data retention policies.
- Implementing the ETL process and optimal data pipeline architecture
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Create design documents that describe the functionality, capacity, architecture, and process.
- Develop, test, and implement data solutions based on finalized design documents.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Proactively identify potential production issues and recommend and implement solutions
Skillsets:
- Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Proficient understanding of distributed computing principles
- Experience in working with batch processing / real-time systems using various open-source technologies like NoSQL, Spark, Pig, Hive, and Apache Airflow.
- Implemented complex projects dealing with considerable data sizes (petabyte scale).
- Optimization techniques (performance, scalability, monitoring, etc.)
- Experience with integration of data from multiple data sources
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, etc.,
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with various messaging systems, such as Kafka or RabbitMQ
- Good understanding of Lambda Architecture, along with its advantages and drawbacks
- Creation of DAGs for data engineering (a minimal Airflow sketch follows this list)
- Expert at Python/Scala programming, especially for data engineering/ETL purposes
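To make the DAG item above concrete, here is a minimal sketch of an Airflow DAG wiring an extract/transform/load sequence (assumes Airflow 2.x; the dag_id, schedule, and task bodies are placeholders, not a real pipeline):

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables; a real pipeline would read from and write to actual sources.
def extract():
    print("pull data from source systems")

def transform():
    print("clean and reshape the extracted data")

def load():
    print("write results to the warehouse")

with DAG(
    dag_id="example_etl",               # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # task dependencies define the DAG edges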
- 1+ years of experience in C#, ASP.NET, ReactJS, and JavaScript
- Strong experience in MVC web frameworks and Web Forms
- Hands-on experience in .NET Core and .NET Framework 4.0 or above, including ADO.NET
- Knowledge of Agile software development
- Experience working with tools such as Docker, Kubernetes, PowerShell, Maven, Jenkins, and SCM tools like Git, SVN, TFS, etc.
- Familiarity with design and architectural patterns, SOA design, and web service development
- Maintain a high level of exposure for the property through direct sales.
- Successfully create business from new and existing guests.
- Connect with the CP, brokers, and prospective buyers.
- Shall have good communication skills.
- Shall coordinate with customers and resolve their queries regarding the project.
FULL STACK DEVELOPER
In summary, we're looking for a software developer who will be responsible for the development of our user-facing application and associated web services using the MERN stack.
Applicants need to have a good knowledge of programming fundamentals and should already know JavaScript (or be confident about picking it up). Candidates with at least one year of experience in front-end or back-end development are preferred.
JOB RESPONSIBILITIES:
- Working closely with primary stakeholders to gather requirements and plan technical solutions.
- Development and testing of front-end and back-end components, and writing a maintainable codebase.
- DevOps, i.e., enabling automation of sharing and building code.
- Documenting and enabling organizational structure and various channels of communication.
- Communicating thorough requirements and issues with the software system.
CANDIDATES MUST HAVE:
- Strong programming fundamentals and demonstrable experience through work or projects in any programming language
- An interest in working in a startup environment
- At least one year of work experience in the industry or on a project with multiple members
- A good understanding of web development fundamentals including JSON, REST, HTTP, client/server, web servers, proxies, reverse proxies, etc.
- A good understanding of JavaScript. Alternatively, experience with other languages/frameworks that would enable quick learning of JavaScript will suffice
- Strong and professional communication skills
computing, and SaaS
• Structured thinker, effective communicator, with excellent programming and analytic skills
• Strategic mind with strong operational, project management and technical architecture skills
• A track record of highly influential technical and leadership achievements
• Demonstrated skill in aligning application decisions to an overarching solution and systems architecture
• Substantial experience leading application design efforts as a senior or lead software engineer
• Deep hands-on experience in the Microsoft technology stack, such as C#, ASP.NET, MVC, WCF, Web API, etc.
• Experience in unit test automation, TDD/BDD.
• Experience in CI/CD using tools like Jenkins, TeamCity, Azure DevOps, etc.
• Expertise in RESTful API, SOA, microservice, and integration architecture and design
• Nice to have: exposure to .NET Core, Docker & Kubernetes