
Digital Media Specialist
at an independent digital marketing solutions company.
- Set up and run large-scale PPC campaigns.
- Optimize performance to increase reach, generate leads and transactions, and reduce cost (multiple optimizations per day, if needed).
- Analyze trends, along with qualitative and quantitative data to recommend campaign changes and updates.
- Track and analyze website traffic flow and provide regular internal reports.
- Create performance reports with recommendations for improvements.
- Coordinate within the team to understand how campaigns can yield more high-quality leads.
- Certification in Google AdWords and Analytics is preferred.
Experience: 1-3 years of experience in:
- Planning, implementing, and reporting the performance of PPC campaigns for multiple clients
- Keyword research and analysis
- Media Planning
- Budget optimization
Platforms:
- Google Marketing Platform
- Specifically Google AdWords (Display, Mobile App, Video) and Analytics
- Bing Editor
Qualities:
- Constant monitoring of the performance of multiple campaigns
- Attention to small details
- Updated on the latest trends on the above-mentioned platforms
General Knowledge:
- Understanding of PPC Campaigns
- Understanding of advertising on Google, Facebook, Linkedin, and Twitter

Position: GM - Plant and Operations (Building Material)
Experience: 12+ years in production and plant operations in the building material industry
Location: Kapadvanj, Gujarat
Salary: Negotiable
Candidates must have experience in production and plant operations in the building material industry (AAC block and wall panel experience preferred).
Good communication skills; presentable and smart.
Mail updated resume with salary details to:
Email: etalenthire[at]gmail[dot]com
Satish: 88O 27 49 743
Location: Pune
Required Skills: Scala, Python, Data Engineering, AWS, Cassandra/AstraDB, Athena, EMR, Spark/Snowflake
Job Description – International Customer Support (US Process)
Company: Accenture
Position Type: Contract-to-Hire (C2H)
Location: Bangalore (Work from Office)
Shifts: US Shifts (5 PM IST – 7 AM IST, rotational week-offs)
Experience: Experienced (7 months to 3 years)
Compensation
- Experienced: ₹28,000 – ₹40,000 CTC (based on level & experience)
About the Role
Accenture is seeking enthusiastic and customer-focused professionals to join our International Customer Support Team. This role involves interacting with US-based clients, ensuring exceptional customer service, and meeting performance benchmarks. Candidates must be flexible to work in night shifts and clear multiple screening rounds (including Versant – minimum score 58).
- Big Data developer with 8+ years of professional IT experience, with expertise in Hadoop ecosystem components spanning ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem-solving, debugging, and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming, and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installation, configuration, and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala, and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading data into HDFS.
- Used the Spark DataFrames API over the Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation, and smooth maintenance of Spark Streaming jobs.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for different events to ingest, aggregate, and load consumer response data into Hive external tables in HDFS, serving as feeds for Tableau dashboards.
- Hands-on experience in using Sqoop to import data into HDFS from RDBMS and vice versa.
- In-depth understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands-on expertise in real-time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in the Microsoft cloud and in setting up clusters in Amazon EC2 & S3, including automation of setting up and extending clusters in the AWS cloud.
- Extensively worked on Spark using Python on clusters for analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on the DataFrames using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, Maven, and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development and with software methodologies such as Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, mainframes, Oracle, and Netezza into HDFS, and in performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig, and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
Job Description
Joining: Immediate
ReDesyn is a merchandise dropshipping company that lets creators launch their merchandise at zero upfront cost. Our mission is to let influencers, apps, brands & creators monetize their reach using merchandise-enabling digital tools.
About Role | Operations Head/Supply Chain Manager
• Managing Supply Chain Operations for a leading B2C eCommerce firm
• Collaborating and directing the activities of all functions involved in the
purchasing, planning, warehousing, and control of materials.
• Complete understanding of the end-to-end supply chain and complex modeling of
workflows for the category
• Provide specific cost reduction and waste removal opportunities within and
across categories
• Identify and articulate strategic importance of metrics as a basis for
managing tradeoffs, improving the customer experience, and making
decisions with internal and external stakeholders
• Leading projects impacting Supply Chain transformation (ordering process,
service levels, inventory, logistics, packaging, etc.)
• Work closely with vendor leadership and/or operations senior leaders on
strategies to reduce cost, lead time, and waste across the end-to-end supply
chain.
• Deliver maximum product availability in the category, ensuring healthy
inventory levels and an optimal supply chain set-up.
• Be solution-oriented & integrate strong and clear data analysis and business
rationale into sound decision-making and problem-solving.
• Ability to manage data and coordinate growing daily order volume.
• Managing and coordinating on-demand manufacturing.
About Us
At Digilytics™, we build and deliver easy-to-use AI products for the secured lending and consumer industry sectors. In an ever-crowded world of clever technology solutions looking for a problem to solve, our solutions start with a keen understanding of what creates and what destroys value in our clients’ business.
Founded by Arindom Basu (Founding member of Infosys Consulting), the leadership of Digilytics™ is deeply rooted in leveraging disruptive technology to drive profitable business growth. With over 50 years of combined experience in technology-enabled change, the Digilytics™ leadership is focused on building a values-first firm that will stand the test of time.
We are currently focused on developing a product, Revel FS, to revolutionise loan origination for mortgages and secured lending. We are also developing a second product, Revel CI, focused on improving trade (secondary) sales to consumer industry clients like auto and FMCG players.
The leadership strongly believes in the ethos of enabling intelligence across the organization. Digilytics AI is headquartered in London, with a presence across India.
Website: www.digilytics.ai
- Know about our products:
- Digilytics RevEL: https://www.digilytics.ai/RevEL/Digilytics
- Digilytics RevUP: https://www.digilytics.ai/RevUP/
- What's it like working at Digilytics: https://www.digilytics.ai/about-us.html
- Digilytics featured in Forbes: https://bit.ly/3zDQc4z
Responsibilities
- Experience with Azure services (virtual machines, containers, databases, security/firewall, Function Apps, etc.)
- Hands-on experience with Kubernetes/Docker/Helm.
- Deployment of Java builds and administration/configuration of Nginx/reverse proxy, load balancers, MS SQL, GitHub, and disaster recovery.
- Linux – must have basic knowledge: user creation/deletion, ACLs, LVM, etc.
- CI/CD – Azure DevOps or any other automation tool such as Terraform, Jenkins, etc.
- Experience with SharePoint and O365 administration
- Azure/Kubernetes certification will be preferred.
- Microsoft Partnership experience is good to have.
- Excellent understanding of required technologies
- Good interpersonal skills and the ability to communicate ideas clearly at all levels
- Ability to work in unfamiliar business areas and to use your skills to create solutions
- Ability to both work in and lead a team and to deliver and accept peer review
- Flexible approach to working environment and hours to meet the needs of the business and clients
Must Haves:
- Hands-on experience with Kubernetes/Docker/Helm.
- Experience with Azure/AWS or any other cloud provider.
- Linux & CI/CD tools knowledge.
Experience & Education:
- A start-up mindset with proven experience working in both smaller and larger organizations with multicultural exposure
- 4 to 9 years of experience working closely with the relevant technologies, and developing world-class software and solutions
- Domain and industry experience by serving customers in one or more of these industries - Financial Services, Professional Services, other Retail Consumer Services
- A bachelor's degree, or equivalent, in software engineering or computer science
About this Role:
As part of the frontend development team, you will be responsible for building and maintaining client-side applications for our users, collaborating with cross-functional teams comprising Product, Design, BI, and other engineers.
You will define best practices for client-side architecture and build for the long term over iterations that bring measurable business value.
You will be involved in recruiting engineers for the team and mentoring them.
As a company, we are very data driven and customer focused. As an engineering team, we are driven by metrics and care deeply about agility without compromising on the quality of our output. We are working towards creating an environment where individuals feel empowered to take ownership and initiative.
About You:
● You have a minimum of 7 years of experience building high-performance consumer-facing mobile applications at product companies of a decent scale
● You have a keen eye for mobile architecture and are able to assist your team in making the right choices for every project
● You have previous experience building react native applications from scratch
● You have prior experience with recruiting and building a high-performance team
● You have a passion for mentoring and helping people on your team grow and achieve their goals
● You practice test-driven development
● You are familiar with both Android and iOS design patterns, and GraphQL
● You have some exposure to native app development in Swift, Kotlin, or Java
● You have strong knowledge of software development fundamentals, including a relevant background in computer science fundamentals and agile development methodologies.
● You are an excellent collaborator & communicator. You know that startups are a team sport. You listen to others, aren’t afraid to speak your mind and always try to ask the right questions.
● You are excited by the prospect of working in a distributed team and company.
Location: We are primarily looking for candidates in Bangalore but are open to other locations in India for the right candidate. At the moment, however, like most teams, we are fully remote.







