Responsibilities
- Lead consultative engagements with clients, exploring their technical and business challenges. This will include delivering workshops, presenting solution overviews and providing benefits analysis of software features and developments
- Provide consulting services to aid the implementation, adoption, operation and expansion of Adobe Campaign at client organisations
- Provide implementation consultancy, customer assistance and post-implementation support to Adobe Campaign clients in their deployment and use of the solution, specifically (but not limited to):
  - Work with clients to assess requirements
  - Design and configure Adobe Campaign to meet customer business needs by understanding their database architecture and setting up ETL and data flows
  - Configure the Adobe Campaign solution to customer requirements, including campaign set-up and building web pages using scripting
  - Develop customer-specific solutions where necessary and document them
  - Proactive problem diagnosis, troubleshooting and resolution
- Create Implementation plans for new and existing clients
- Creation of accurate estimates and quotations for Adobe Campaign Consulting services
- Creation and maintenance of relevant documentation including requirements, functional and technical specs that define and support the solution
- Ownership of the technology solution from estimation and design through to delivery, acceptance and operation
- Knowledge transfer/Training to customers on solution implemented
- Product Upgrades and Configuration activities
- Report on project delivery to Project Management, and independently manage smaller projects that do not have dedicated Project Managers
- Build strong working relationships with the onshore consulting team, and learn and adopt best practices from such teams globally
- Work closely with the Quality Assurance team to ensure professional delivery of technical implementations
Kindly note that only candidates who graduated in 2022 or 2023, are based in Mumbai, and can join immediately will be considered for this role.
JD - Data Operations Analyst
What is the job and team like?
- As a Data Operations Analyst, you manage business reporting for numerous teams and constantly monitor performance
- Check the integrity of the revenue reporting done by the different systems so that the correct profitability is reported to the CXOs
- Send reports periodically and alert stakeholders to changes in the key performance metrics
- Allocate efforts to different business implementations that help build the Profit/Loss statement for the financials
- Track crucial data points which affect the core of the business and escalate them to senior stakeholders
Roles and Responsibilities
- Graduate in an IT background (BE / BSc IT / BCA); 2022 and 2023 graduates only
- Execute a set of business processes daily/weekly/monthly as per business requirements
- Provide ad-hoc data support on any urgent reports and material in an expedited manner
- Maintain a list of open tasks and escalations, and send updates to the relevant stakeholders
- Have an eye for detail, should have the ability to look at numbers, spot trends and identify gaps
- Identify efficient and meaningful ways to communicate data and analysis through ongoing reports and dashboards
- Proficiency in SQL, Excel and any statistical and analytical tools such as SAS, SPSS is a big plus
- Managing master data, including creation, updates, and deletion.
- Ability to work in a fast paced, technical, cross functional environment
- Familiarity with Internet Industry and Online Advertising Business is a plus
Ideal candidate
- Import and export large volumes of data to and from database tables as required
- Should be able to write Data Definition Language (DDL) and Data Manipulation Language (DML) SQL commands
- Develop programs and methodologies to get analyzable data on a regular basis
- Good team player and multi-tasker
- Should have the ability to learn and adapt to change
- Self-starter; must be productive with minimal direction
- High-level written and verbal communication skills
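The DDL and DML skills listed above can be illustrated with a minimal sketch. The `orders` table and its columns are invented for the example, and SQLite stands in for whatever database the role actually uses:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: define the schema (hypothetical table, for illustration only).
cur.execute("""
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        customer TEXT NOT NULL,
        amount   REAL NOT NULL
    )
""")

# DML: insert and update rows.
cur.executemany(
    "INSERT INTO orders (order_id, customer, amount) VALUES (?, ?, ?)",
    [(1, "alice", 120.0), (2, "bob", 80.0)],
)
cur.execute("UPDATE orders SET amount = amount * 1.1 WHERE customer = ?", ("alice",))
conn.commit()

total = cur.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))  # 212.0
```

DDL statements (`CREATE`, `ALTER`, `DROP`) change the schema, while DML statements (`INSERT`, `UPDATE`, `DELETE`, `SELECT`) operate on the rows inside it.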
Job Details
Work mode: In office
Must-have skills: SQL, MS Excel, Communication
Purpose of Job:
We are looking for someone who can manage the daily activities of the team responsible for the design, implementation, maintenance and support of data warehouse systems and related data marts, and who will oversee data design and the creation of database architecture.
Job Responsibilities:
- 7+ years of industry experience and 2+ years of experience managing a team
- Exceptional knowledge in designing modern databases such as MySQL, Postgres, Redshift, Snowflake, Hive or Presto
- Good experience working in agile-based projects
- Experience in understanding and converting BRDs into technical designs
- Work with management to provide effort estimation and timelines
- Work closely with IT, business teams and other internal stakeholders to resolve business queries within the defined SLA
- Exceptional knowledge of SQL and of designing tables, databases and partitions, and of query optimization
- Good experience in designing ETL solutions in tools such as Talend / AWS Glue / EMR
- At least 3 years of experience working on AWS Cloud
- Hands-on in monitoring ETL jobs and performing health checks to ensure data quality
- Good exposure to AWS services such as RDS, Aurora, S3, Lambda, Glue, EMR, Step Functions, etc.
- Design quality assurance tests for ensuring data integrity and quality
- Good to have: Python and big data knowledge
Qualifications:
At least a bachelor’s degree in Science, Engineering or Applied Mathematics; a Master’s in Computer Science or a related field is preferred.
Other Requirements: Leadership skills, excellent communication skills, ability to own tasks
At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.
Key Responsibilities
- The Kafka Engineer is responsible for designing and recommending the best approach for data movement to and from different sources using Apache/Confluent Kafka.
- Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and follow best practices.
- Develop and ensure adherence to published system architectural decisions and development standards
- Must be comfortable working with offshore/global teams to deliver projects
Required Skills
- Good understanding of Event-based architecture, messaging frameworks and stream processing solutions using Kafka Messaging framework.
- 3+ years hands-on experience working on Kafka connect using schema registry in a high-volume environment.
- Strong knowledge and exposure to Kafka brokers, zookeepers, KSQL, KStream and Kafka Control centre.
- Good experience working on Kafka connectors such as MQ connectors, Elastic Search connectors, JDBC connectors, File stream connectors, JMS source connectors, Tasks, Workers, converters, and Transforms.
- Hands on experience with developing APIs and Microservices
- Solid expertise with Java
- Working knowledge of Kafka REST Proxy, and experience building custom connectors using Kafka core concepts and the API.
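As a rough illustration of the connector work described above, the snippet below assembles a Kafka Connect JDBC source connector configuration. The connector name, connection URL and column name are placeholders invented for the sketch; in practice the resulting JSON is submitted to the Connect REST API (`POST /connectors`):

```python
import json

# A minimal sketch of a Kafka Connect JDBC source connector config.
# "orders-jdbc-source", the connection URL and "order_id" are placeholders.
connector_config = {
    "name": "orders-jdbc-source",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:mysql://db-host:3306/shop",  # placeholder endpoint
        "mode": "incrementing",                 # poll new rows by a growing key
        "incrementing.column.name": "order_id",
        "topic.prefix": "shop-",                # tables land on topics shop-<table>
        "tasks.max": "2",                       # parallelism across workers
    },
}

# Serialized form, as it would be POSTed to the Connect REST API.
payload = json.dumps(connector_config, indent=2)
print(payload)
```

With `mode=incrementing`, the connector tracks the highest value of the named column and only pulls rows above it on each poll, which is what makes high-volume JDBC sourcing tractable.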
Good to have:
- Understanding of Data warehouse architecture and data modelling
- Good knowledge of big data ecosystem to design and develop capabilities to deliver solutions using CI/CD pipelines.
- Good understanding of other AWS services such as CloudWatch monitoring, scheduling, and automation services
- Strong skills in In-memory applications, Database Design, and Data Integration
- Ability to guide and mentor team members on using Kafka.
Experience: 3 to 8 Years
Work timings: 2:30 PM - 11:30 PM
Location: Hyderabad
What you will do:
- Bringing data to life via historical and real-time dashboards, like:
  1. Transaction behavior analytics
  2. User-level analytics
  3. Propensity models and personalization models, e.g. what is the best product or offer to drive a sale
  4. Email / SMS / AN analytics models, getting data from platforms like Netcore, Branch and internal tables
  5. Other reports that are relevant to know what is working, gaps, etc.
- Monitoring key metrics such as commission, gross margin, conversion, customer acquisitions etc
- Using the data models and reports to draw actionable and meaningful insights, and based on those insights helping drive strategy, optimization opportunities, product improvements and more
- Demonstrating examples where data interpretation led to improvement in core business outcomes like better conversion, better ROI from Ad spends, improvements in product etc.
- Digging into data to identify opportunities or problems and translating them into an easy-to-understand form for all key business teams
- Working closely with various business stakeholders: Marketing, Customer Insights, Growth and Product teams
- Ensuring effective and timely delivery of reports and insights that analyze business functions and key operations and performance metrics
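The metric monitoring described above can be sketched as follows; the session count and the transaction records (with `revenue`, `cost`, `commission` fields) are invented for illustration, not a real schema:

```python
# Illustrative inputs; in practice these would come from internal tables.
sessions = 2000
transactions = [
    {"revenue": 500.0, "cost": 350.0, "commission": 25.0},
    {"revenue": 300.0, "cost": 180.0, "commission": 15.0},
]

orders = len(transactions)
revenue = sum(t["revenue"] for t in transactions)
cost = sum(t["cost"] for t in transactions)
commission = sum(t["commission"] for t in transactions)

conversion_rate = orders / sessions        # orders per session
gross_margin = (revenue - cost) / revenue  # fraction of revenue retained

print(f"conversion={conversion_rate:.2%} "
      f"gross_margin={gross_margin:.1%} "
      f"commission={commission:.2f}")
```

A dashboard or alerting job would compute these per period and flag stakeholders when a metric drifts outside an agreed band.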
What you need to have:
- Minimum 2 years of data analytics and interpretation experience
- Proven experience to show how data analytics shaped strategy, marketing and product. Should have multiple examples of this based on current and past experience
- Strong Data Engineering skills using tools like SQL, Python, Tableau, Power BI, Advanced Excel, PowerQuery etc
- Strong marketing acumen to be able to translate data into marketing outcomes
- Good understanding of Google Analytics, Google AdWords, Facebook Ads, CleverTap and other analytics tools
- Familiarity with data sources like Branch/ Clevertap / Firebase, Big Query, Internal Tables (User/ Click/ Transaction/ Events etc)
Specialism: Advanced Analytics, Data Science, regression, forecasting, analytics, SQL, R, Python, decision trees, random forests, SAS, clustering, classification
Senior Analytics Consultant- Responsibilities
- Understand the business problem and requirements by building domain knowledge, and translate them into a data science problem
- Conceptualize and design cutting-edge data science solutions to solve the data science problem, applying design thinking concepts
- Identify the right algorithms, tech stack and sample outputs required to efficiently address the end need
- Prototype and experiment with the solution to successfully demonstrate its value
- Independently, or with support from the team, execute the conceptualized solution as per plan, following project management guidelines
- Present the results to internal and client stakeholders in an easy-to-understand manner with great storytelling, storyboarding, insights and visualization
- Help build overall data science capability for eClerx through support in pilots, pre-sales pitches, product development and practice development initiatives
Job Description
Experience: 3+ yrs
We are looking for a MySQL DBA who will be responsible for ensuring the performance, availability and security of clusters of MySQL instances. You will also be responsible for database design and architecture, orchestrating upgrades, backups and the provisioning of database instances. You will work in tandem with other teams, preparing documentation and specifications as required.
Responsibilities:
Database design and data architecture
Provision MySQL instances, both in clustered and non-clustered configurations
Ensure performance, security, and availability of databases
Prepare documentation and specifications
Handle common database procedures, such as upgrade, backup, recovery, migration, etc.
Profile server resource usage, optimize and tweak as necessary
Skills and Qualifications:
Proven expertise in database design and data architecture for large scale systems
Strong proficiency in MySQL database management
Decent experience with recent versions of MySQL
Understanding of MySQL's underlying storage engines, such as InnoDB and MyISAM
Experience with replication configuration in MySQL
Knowledge of de-facto standards and best practices in MySQL
Proficient in writing and optimizing SQL statements
Knowledge of MySQL features, such as its event scheduler
Ability to plan resource requirements from high level specifications
Familiarity with other SQL/NoSQL databases such as Cassandra, MongoDB, etc.
Knowledge of limitations in MySQL and their workarounds in contrast to other popular relational databases
- We are looking for an experienced data engineer to join our team.
- The preprocessing involves ETL tasks using PySpark and AWS Glue, staging data in Parquet format on S3, and querying it with Athena
To succeed in this data engineering position, you should care about well-documented, testable code and data integrity. We have devops who can help with AWS permissions.
We would like to build up a consistent data lake with staged, ready-to-use data, and to build up various scripts that will serve as blueprints for various additional data ingestion and transforms.
If you enjoy setting up something which many others will rely on, and have the relevant ETL expertise, we’d like to work with you.
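As a stand-in for the PySpark/Glue pipeline described above, here is a sketch of the extract-transform-stage shape using only the standard library. In the real pipeline the cleaned data would be written as Parquet to S3 and queried via Athena; the column names here are hypothetical:

```python
import csv
import io

# Extract: raw input (a StringIO stands in for a file landed on S3).
raw = io.StringIO("user_id,amount,country\n1,10.5,US\n2,,US\n3,7.0,DE\n")

# Transform: drop rows failing a simple integrity check, cast types.
reader = csv.DictReader(raw)
staged = []
for row in reader:
    if not row["amount"]:  # data integrity: reject rows missing the amount
        continue
    staged.append({
        "user_id": int(row["user_id"]),
        "amount": float(row["amount"]),
        "country": row["country"],
    })

# Stage: write the cleaned, typed records back out (CSV here; Parquet in prod).
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["user_id", "amount", "country"])
writer.writeheader()
writer.writerows(staged)
print(out.getvalue())
```

The point of staging is that every downstream script reads the cleaned, typed output rather than re-validating raw input, which is what makes the blueprint reusable.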
Responsibilities
- Analyze and organize raw data
- Build data pipelines
- Prepare data for predictive modeling
- Explore ways to enhance data quality and reliability
- Potentially, collaborate with data scientists to support various experiments
Requirements
- Previous experience as a data engineer with the above technologies
High-Level Scope of Work:
- Work with the AI / Analytics team to prioritize identified machine learning use cases for development and rollout
- Meet and understand current retail / marketing requirements and how an AI/ML solution will address and automate the decision process
- Develop AI/ML programs using the Dataiku solution and Python or open-source technology, with a focus on delivering high-quality, accurate ML prediction models
- Gather additional and external data sources to support the AI/ML model as desired
- Support the ML model and fine-tune it to ensure high accuracy at all times
- Example use cases: customer segmentation, product recommendation, price optimization, retail customer personalization offers, next best location for business establishment, CCTV computer vision, NLP and voice recognition solutions
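The customer segmentation use case above can be sketched with a toy k-means, a classic unsupervised approach. Real projects would use scikit-learn or a Dataiku recipe; the two features per customer (spend, visits) and the data points are invented for illustration:

```python
# Each customer is a (monthly_spend, visit_count) pair -- invented data.
customers = [(5, 1), (6, 2), (50, 20), (55, 18), (4, 1), (52, 22)]

def kmeans(points, k, iters=10):
    centroids = list(points[:k])  # naive init; k-means++ in practice
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest centroid (squared distance)
            i = min(range(k), key=lambda c: (p[0] - centroids[c][0]) ** 2
                                          + (p[1] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        # move each centroid to the mean of its assigned points
        centroids = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

centroids, clusters = kmeans(customers, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]: low-spend vs high-spend segments
```

The algorithm alternates assignment and centroid-update steps until the segments stabilize; on this toy data it cleanly separates the low-spend and high-spend customers.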
Required technology expertise :
- Deep knowledge and understanding of machine learning algorithms (supervised / unsupervised learning / deep learning models)
- At least 5 years of hands-on experience with the Python and R statistical programming languages
- Strong database development knowledge using SQL and PL/SQL
- Experience using a commercial data science solution, particularly Dataiku (Alteryx, SAS, Azure ML, Google ML or Oracle ML is a plus)
- Strong hands-on experience with big data solution architecture and optimization for AI/ML workloads
- Hands-on experience with data analytics and BI tools, particularly Oracle OBIEE and Power BI
- Have implemented and developed at least 3 successful AI/ML projects with tangible business outcomes in the retail-focused industry
- At least 5 years of experience in the retail industry and customer-focused business
- Ability to communicate with business owners and stakeholders to understand their current issues and provide machine learning solutions accordingly
Qualifications
- Bachelor’s or Master’s degree in Data Science, Artificial Intelligence or Computer Science
- Certified as a Data Scientist or Machine Learning expert