PuTTY Jobs in Mumbai
at Magic9 Media and Consumer Knowledge Pvt. Ltd.
This requirement is to service our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Their work will have a direct impact on driving business strategy for prominent industry leaders. Self-motivation and strong communication skills are both must-haves. The ability to work in a fast-paced environment is desired.
Problems being solved by our client:
Measure consumer usage of devices linked to the internet and home networks, including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats, and other appliances. There are more screens and connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses measurement technology to unravel the dynamics of consumers’ interactions with multiple devices.
Duties and responsibilities:
- The successful candidate will contribute to the development of novel audience measurement and demographic inference solutions.
- Develop, implement, and support statistical or machine learning methodologies and processes.
- Build and test new features and concepts and integrate them into the production process
- Participate in ongoing research and evaluation of new technologies
- Apply your experience across the development lifecycle: analysis, design, development, testing, and deployment of this system
- Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely, quality data. You will be the knowledge expert responsible for the data delivered to our clients
- 3-5 years relevant work experience in areas as outlined below
- Experience in extracting data using SQL from large databases
- Experience in writing complex ETL processes and frameworks for analytics and data management; hands-on experience with ETL tools is a must
- Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered.
- Programming experience in a scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL). General-purpose programming (Python, Scala, Java) and familiarity with Hadoop are a plus.
- Excellent verbal and written communication skills.
- Experience with TV or digital audience measurement or market research data is a plus.
- Familiarity with systems analysis or systems thinking is a plus.
- Must be comfortable analyzing complex, high-volume, high-dimensional data from varying sources
- Ability to engage with Senior Leaders across all functional departments
- Ability to take on new responsibilities and adapt to changes
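The SQL extraction skill listed above can be illustrated with a minimal sketch. It uses Python's built-in sqlite3 module as a stand-in for a production warehouse; the viewership table, its columns, and the sample rows are all hypothetical.

```python
import sqlite3

# In-memory SQLite database as a stand-in for a large production warehouse.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE viewership (household_id INTEGER, device TEXT, minutes INTEGER)"
)
conn.executemany(
    "INSERT INTO viewership VALUES (?, ?, ?)",
    [(1, "smart_tv", 120), (1, "mobile", 45), (2, "smart_tv", 90), (2, "tablet", 30)],
)

# Aggregate viewing minutes per device type, largest first.
rows = conn.execute(
    """
    SELECT device, SUM(minutes) AS total_minutes
    FROM viewership
    GROUP BY device
    ORDER BY total_minutes DESC
    """
).fetchall()

for device, total in rows:
    print(device, total)
```

In practice the same query shape (aggregate, group, order) applies unchanged against a production database; only the connection object differs.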
This profile will include the following responsibilities:
- Develop Parsers for XML and JSON Data sources/feeds
- Write Automation Scripts for product development
- Build API Integrations for 3rd Party product integration
- Perform Data Analysis
- Research machine learning algorithms
- Understand AWS cloud architecture and work with third-party vendors for deployments
- Resolve issues in the AWS environment
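The parser-development responsibility above can be sketched with Python's standard library alone. The feed layouts (a "records" array in JSON, <record> elements in XML) and the field names are assumptions for illustration.

```python
import json
import xml.etree.ElementTree as ET

def parse_json_feed(payload: str) -> list:
    """Parse a JSON feed into a list of record dicts (assumed layout)."""
    data = json.loads(payload)
    return data.get("records", [])

def parse_xml_feed(payload: str) -> list:
    """Parse an XML feed of <record> elements into dicts (assumed layout)."""
    root = ET.fromstring(payload)
    return [
        {child.tag: child.text for child in record}
        for record in root.iter("record")
    ]

# Tiny hypothetical feeds to exercise both parsers.
json_feed = '{"records": [{"id": "1", "name": "alpha"}]}'
xml_feed = "<feed><record><id>2</id><name>beta</name></record></feed>"

print(parse_json_feed(json_feed))
print(parse_xml_feed(xml_feed))
```

A production parser would add schema validation and error handling, but the json / ElementTree split shown here is the usual starting point.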
We are looking for candidates with:
Programming Language: Python
Web Development: Basic understanding of Web Development. Working knowledge of Python Flask is desirable
Database & Platform: AWS/Docker/MySQL/MongoDB
Basic Understanding of Machine Learning Models & AWS Fundamentals is recommended.
- Create data funnels to feed models via web, structured, and unstructured data
- Maintain coding standards using SDLC practices, Git, AWS deployments, etc.
- Keep abreast of developments in the field
- Deploy models in production and monitor them
- Document processes and logic
- Take ownership of the solution from code to deployment and performance
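The data-funnel duty above can be sketched in plain Python: one branch ingests structured CSV, another turns unstructured text into crude bag-of-words features. The input formats and helper names are hypothetical; a real funnel would add validation, storage, and scheduling.

```python
import csv
import io
import re

def funnel_structured(csv_text: str) -> list:
    """Structured branch: parse CSV rows into feature dicts."""
    return [dict(row) for row in csv.DictReader(io.StringIO(csv_text))]

def funnel_unstructured(text: str) -> dict:
    """Unstructured branch: crude bag-of-words token counts."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = {}
    for tok in tokens:
        counts[tok] = counts.get(tok, 0) + 1
    return counts

# Hypothetical inputs: one structured, one unstructured.
rows = funnel_structured("user,age\nalice,34\nbob,29\n")
features = funnel_unstructured("Great product, great support!")
print(rows)
print(features)
```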
- Minimum 1 year of relevant experience in PySpark (mandatory)
- Hands-on experience developing, testing, deploying, maintaining, and improving data integration pipelines in an AWS cloud environment is an added plus
- Ability to play a lead role and independently manage a 3-5 member PySpark development team
- EMR, Python, and PySpark are mandatory.
- Working knowledge of AWS cloud technologies such as Apache Spark, Glue, Kafka, Kinesis, and Lambda, along with S3, Redshift, and RDS
Mandatory Skills required:
1. Ability to translate business requirements into technical requirements for QlikView
2. Perform detailed analysis of source systems and source system data, and model that data in QlikView
3. Design, develop, and test QlikView scripts to import data from source systems, data feeds, and flat files to create Qlik marts
4. Proficiency with QlikView scripting, use of complex QlikView functions, advanced QlikView expressions, and experience with complex data models and optimizing data models for query performance to create Qlik marts
5. Architecture optimization (includes hardware sizing, security setup, performance tuning)
6. Development in QlikView/QlikSense
A manufacturing giant headquartered in Mumbai.
SAP B1 Developer
- Degree or comparable qualification in computer science
- Experience with ERP systems, ideally SAP.
- Experience with object-oriented development in Java, C#, and/or VB.NET.
- Experience with web services (REST, SOAP).
- Database Application Knowledge (SQL)
- Knowledge in handling of data formats (JSON, XML)
- Experience with version control systems (GIT, TFVC)
Experience level: Minimum 10 years
This role would lead DBA teams comprising DBAs of multiple experience levels for a mix of Teradata, Oracle, and SQL databases.
Minimum 10 years of relevant Database and Datawarehouse experience.
Hands-on experience administering Teradata.
Leading the performance analysis, capacity planning and supporting the batchops and users with their jobs.
Drive implementation of standards and best practices to optimize database utilization and availability.
Hands on with AWS Cloud infrastructure services such as EC2, S3 and network services.
Proficient in Linux system administration relevant to Teradata management.
Teradata Specific (Mandatory)
Manage and Operate 24x7 production as well as development databases to ensure maximum availability of system resources.
Responsible for operational activities of a Database Administrator such as System monitoring, User Management, Space Management, Troubleshooting, and Batch/user support.
Perform DBA related tasks in key areas of Performance Management & Reporting, workload management using TASM.
Manage Production/Development databases in areas like Capacity Planning, Performance Monitoring & Tuning, Strategies Defined for Backup/Recovery Techniques, Space/ User/ Security management along With Problem determination and resolution.
Experience with Teradata Workload management & monitoring and query optimization.
Expertise with system monitoring using viewpoint and logs.
Proficient in analysing the performance and optimizing at different levels.
Ability to create advanced system-level capacity reports as well as root cause analysis.
Oracle Specific (Optional)
Database administration: installation of Oracle software on Unix/Linux platforms.
Database lifecycle management: database creation, setup, and decommissioning.
Database event alert monitoring, space management, user management.
Database upgrades, migrations, and cloning.
Database backup, restore, and recovery using RMAN.
Setup and maintain High-Availability and Disaster Recovery solutions.
Proficient in Standby and Data Guard technology.
Hands-on with Oracle Enterprise Manager Cloud Control (OEM CC).
- Teradata Vantage Certified Administrator
at Accion Labs
A Solutions Architect is responsible for validating the logical models, ensuring standards, driving consolidation of redundant data, and enforcing the strategic vision through data models. The Architect role has an in-depth understanding of both our business capabilities and how they align to our enterprise data models. Partners with Enterprise Architecture to consult on and develop domain models. Consults with project teams and functional units on the design of important projects or services. Consults with business leadership on the design of systems and projects. May consult with leadership on emerging technologies.
To be successful as a solution architect, you should be able to integrate any updated specifications and requirements into the systems architecture. An outstanding solution architect should be able to explain complex problems to management in layman’s terms.
Building and integrating information systems to meet the company’s needs.
Assessing the systems architecture currently in place and working with technical staff to recommend solutions to improve it.
E2E accountability for solution design across multiple products, integrations, and technologies, delivering successful business outcomes that meet reliability, availability, and serviceability needs.
Experience working with the latest emerging technologies and programming languages such as Java, .NET, MERN stack, MEAN stack, Angular, React, VueJS, NodeJS, Blockchain, GoLang, ML, Data Science related areas, etc.
Provide detailed specifications for proposed solutions
Resolving technical problems as they arise.
Providing supervision and guidance to development teams.
Continually researching current and emerging technologies and proposing changes where needed.
Informing various stakeholders about any problems with the current technical solutions being implemented.
Assessing the business impact that certain technical choices have.
Providing updates to stakeholders on product development processes, costs, and budgets.
Work closely with Information Technology professionals within the company to ensure hardware is available for projects and working properly.
Propose and establish a framework for necessary contributions from various departments.
Account for possible project challenges and constraints, including risks, time, resources, and scope.
We are looking for an engineer with ML/DL background.
Ideal candidate should have the following skill set:
1) Experience building and deploying systems
2) Experience with Theano/Torch/Caffe/Keras (all useful)
3) Experience with data warehousing/storage/management would be a plus
4) Experience writing production software would be a plus
5) Ideal candidate should have developed their own DL architectures apart from using open-source architectures
6) Ideal candidate would have extensive experience with computer vision applications
Candidates would be responsible for building Deep Learning models to solve specific problems. Workflow would look as follows:
1) Define Problem Statement (input -> output)
2) Preprocess Data
3) Build DL model
4) Test on different datasets using Transfer Learning
5) Parameter Tuning
6) Deployment to production
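The six-step workflow above can be sketched end to end. To stay self-contained, a single-neuron logistic model stands in for a real DL architecture and the dataset is synthetic; transfer learning and deployment are only indicated in comments, not implemented.

```python
import math
import random

# 1) Problem statement: input x (two features) -> output y in {0, 1}.
random.seed(0)
raw = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
labels = [1 if x1 + x2 > 0 else 0 for x1, x2 in raw]

# 2) Preprocess: here just a train/test split (real pipelines also
#    normalize, clean, and augment the data).
train_x, test_x = raw[:150], raw[150:]
train_y, test_y = labels[:150], labels[150:]

# 3) Build model: a single-neuron logistic "network" stands in for a
#    DL architecture.
w1, w2, b = 0.0, 0.0, 0.0

def predict(x1, x2):
    z = w1 * x1 + w2 * x2 + b
    return 1.0 / (1.0 + math.exp(-z))

# 5) Parameter tuning: a fixed learning rate here; in practice searched
#    or scheduled.
lr = 0.5
for _ in range(200):  # training loop: SGD on the log loss
    for (x1, x2), y in zip(train_x, train_y):
        g = predict(x1, x2) - y
        w1 -= lr * g * x1
        w2 -= lr * g * x2
        b -= lr * g

# 4) Test on held-out data (transfer learning would instead reuse the
#    trained weights as a starting point on a new dataset).
accuracy = sum(
    (predict(x1, x2) > 0.5) == bool(y) for (x1, x2), y in zip(test_x, test_y)
) / len(test_y)
print(f"held-out accuracy: {accuracy:.2f}")

# 6) Deployment: in production the weights would be serialized, served
#    behind an API, and monitored for drift.
```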
Candidates should have experience working on deep learning and an engineering degree from a top-tier institute (preferably IIT/BITS or equivalent).