Good experience in the extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and PowerConnect as ETL tools on Oracle and SQL Server databases.
Knowledge of data warehouse/data mart, ODS, OLTP, and OLAP implementations, combined with project scoping, analysis, requirements gathering, data modeling, ETL design, development, system testing, implementation, and production support.
Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy,
Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
Developed mapping parameters and variables to support SQL override.
Created mapplets for reuse across different mappings.
Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
Modified existing mappings for enhancements of new business requirements.
Involved in Performance tuning at source, target, mappings, sessions, and system levels.
Prepared migration documents to move mappings from development to testing and then to production repositories.
Extensive experience in developing stored procedures, functions, views, triggers, and complex SQL queries using PL/SQL.
Experience in resolving ongoing maintenance issues and bug fixes; monitoring Informatica/Talend sessions as well as performance tuning of mappings and sessions.
Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
Extensive experience in writing UNIX shell scripts and automation of the ETL processes using
UNIX shell scripting.
Experience in using Automation Scheduling tools like Control-M.
Hands-on experience across all stages of Software Development Life Cycle (SDLC) including
business requirement analysis, data mapping, build, unit testing, systems integration, and user
acceptance testing.
Built, operated, monitored, and troubleshot Hadoop infrastructure.
Developed tools and libraries, and maintained processes for other engineers to access data and write MapReduce programs.
The Client is the world’s largest media investment company. Our team of experts supports clients in programmatic, social, paid search, analytics, technology, organic search, affiliate marketing, e-commerce, and traditional channels.
Responsibilities of the role:
· Manage extraction of data sets from multiple marketing/database platforms and perform hygiene and quality control steps, either via Datorama or in partnership with the Neo Technology team. Data sources will include web analytics tools, media analytics, customer databases, social listening tools, search tools, syndicated data, research & survey tools, etc.
· Implement and manage data system architecture
· Audit and manage data taxonomy/classifications from multiple systems and partners
· Manage the extraction, loading, and transformation of multiple data sets
· Cleanse all data and metrics; perform override updates where necessary
· Execute all business rules according to requirements
· Identify and implement opportunities for efficiency throughout the process
· Manage and execute thorough QA process and ensure quality and accuracy of all data and reporting deliverables
· Manipulate and analyze “big” data sets synthesized from a variety of sources, including media platforms and marketing automation tools
· Generate and manage all data visualizations and ensure data is presented accurately and is visually pleasing
· Assist analytics team in running numerous insights reports as needed
· Help maintain a performance platform and provide ongoing insights and recommendations.
Requirements:
· 5+ years’ experience in an analytics position working with large amounts of data
· Hands-on experience working with data visualization tools such as Datorama, Tableau, or PowerBI
· Additional desirable skills include tag management experience, application coding experience, and a statistics background
· Digital media background preferred, including knowledge of DoubleClick and web analytics tools
· Excellent communication skills
· Experience with HTML, CSS, and JavaScript a plus
● Research and develop advanced statistical and machine learning models for
analysis of large-scale, high-dimensional data.
● Dig deeper into data, understand characteristics of data, evaluate alternate
models and validate hypotheses through theoretical and empirical approaches.
● Productize proven or working models into production quality code.
● Collaborate with product management, marketing, and engineering teams in
Business Units to elicit and understand their requirements and challenges,
and develop potential solutions.
● Stay current with latest research and technology ideas; share knowledge by
clearly articulating results and ideas to key decision makers.
● File patents for innovative solutions that add to the company's IP portfolio.
Requirements
● 4 to 6 years of strong experience in data mining, machine learning and
statistical analysis.
● BS/MS/PhD in Computer Science, Statistics, Applied Math, or related areas
from premier institutes (only IITs / IISc / BITS / top NITs or top US
universities should apply)
● Experience in productizing models to code in a fast-paced start-up
environment.
● Expertise in the Python programming language and fluency in analytical
tools such as MATLAB, R, Weka, etc.
● Strong intuition for data and a keen aptitude for large-scale data analysis
● Strong communication and collaboration skills.
Must Have Skills: Core Java, Microservices, Angular 8, Spring Boot framework
Minimum 4-8 years of experience in the design, development, and deployment of Java/J2EE-based applications, and in writing PL/SQL queries/stored procedures.
Strong hands-on experience in the design, development, and deployment of J2EE-based applications, database design, complex PL/SQL queries, and stored procedures
- Expertise on XML, HTML, JavaScript, AJAX
- Proficient in handling and writing complex PL/SQL queries and Stored procedures
- Preferred: work experience with JBoss as an application server, or any other J2EE application server
- Should have sound knowledge in product customization, implementation, and maintenance aspects
- Foresighted & good judgment in problem-solving.
- Capable of paying attention to detail.
- Good analytical and logical thinking.
- Prioritizing and organizing
- Team Player with a positive attitude.
- Process knowledge/Technical expertise
- Good Written and Verbal communication Skills
Minimum Qualifications
- Excellent problem-solving skills and the right attitude to work in fast-paced environments
- Bachelor’s degree in computer science or equivalent practical experience
- 2 to 5 years of experience in software development using modern frontend frameworks in JavaScript/TypeScript.
- Strong understanding of data structures and algorithms.
Preferred Qualifications
- Strong in Object Oriented Programming and Design Patterns.
- You have experience working closely with product and design teams to deliver products that materially impact the business and improve the customer experience
- You follow SOLID principles, have experience with microservice architecture, and have designed and implemented high-performance, scalable services/APIs.
- You have experience with component-based architectures, PWAs, service workers, and UI patterns and libraries, preferably ReactJS / NextJS
- Write high-performance client-side applications and develop prototypes
- Experience working with Node, NestJS / Express.
- Experience working with PostgreSQL, Redshift, Dynamo, Mongo and Cassandra databases.
- Experience working with RabbitMQ, Kafka.
- You constantly learn and adopt best practices at work, keeping in mind app performance, security, and scalability.
- You have experience working in distributed systems and have built/designed systems with failover, event streaming, and caching strategies
- You have experience with Docker/Kubernetes on AWS or other cloud computing platforms, and are familiar with CI/CD processes.
Bonus if you have great communication and team collaboration skills.
Skills:
- Expertise in Python 3 on AWS serverless
- Experience with the AWS serverless stack: AppSync, Lambda, Cognito, API Gateway, DynamoDB, Elasticsearch, SQS, S3, CodeCommit & CodeDeploy.
- Proficient in modern microservice-based architectures and methodologies.
- Experience with a database technology, preferably NoSQL such as AWS DynamoDB.
- Build human-centric UX with us using technologies like React, TypeScript, GraphQL, and CSS-in-JS.
- Experience building data processing pipelines (SQS, Kinesis, DynamoDB, AWS Lambda, or similar)
- Deep technical hands-on experience developing in REST/JSON or SOAP/XML, combined with strong knowledge of concepts such as CORS (Cross-Origin Resource Sharing), headers, security, JSON, and HTTP.
- Experience with GitHub and advanced GitHub features (good to have).
- Must have worked as part of Agile teams; experience with DevOps practices such as continuous integration tools (e.g., Jenkins), code repositories, and creating CI/CD pipelines is required.
6-day work week; remote working is acceptable.
JOB DESCRIPTION
Founded by experienced founders and funded by Tier-1 VCs, it's a solution for democratizing the shopping experience on e-commerce platforms. Our aim is to provide a superior shopping experience for all our partners and improve both customer satisfaction and their GMV. Being an early-stage company, we are looking for self-driven, motivated people who want to build something exciting and are always looking out for the next big thing. We plan to build this company remotely, which brings freedom but also an added sense of responsibility. If all this sounds interesting to you, read on!
Responsibilities
- Writing testable and efficient code
- Designing and implementing low-latency, high-availability, performant applications
- Implementing security and data protection
- Implementing business logic and developing APIs and services
- Building reusable code and libraries for future use
Skills And Qualifications
- 2-3 years of hands-on experience in back-end development with Node.js.
- Knowledge of Node.js frameworks such as Restify
- Good understanding of server-side templating languages
- Basic understanding of front-end technologies, such as HTML5 and CSS3
- Expertise with Linux-based systems
- Proficient understanding of code versioning tools, such as Git
- Have worked with cloud-based platforms and tools such as AWS, GCP, Docker, or Kubernetes.
Traits we value
- Independent, resourceful, analytical, and able to solve problems effectively
- Ability to be flexible, agile, and thrive in chaos
- Excellent oral and written communication skills
Job brief
We are looking for an Android developer who possesses a passion for pushing mobile technologies to the limits. The Android developer will work with our team of talented engineers to design and build the next generation of our mobile applications for our Banking Products, and will work closely with other app development and technical teams.
Responsibilities
- Design and build advanced applications for the Android platform
- Collaborate with cross-functional teams to define, design, and ship new features
- Work with outside data sources and APIs
- Unit-test code for robustness, including edge cases, usability, and general reliability
- Work on bug fixing and improving application performance
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency
Requirements
- Proven software development experience and Android development skills
- Proven working experience in Android app development
- Have published at least one original Android app
- Experience with Android SDK
- Experience working with remote data via REST and JSON
- Experience with third-party libraries and APIs
- Working knowledge of the general mobile landscape, architectures, trends, and emerging technologies
- Solid understanding of the full mobile development life cycle.
About the Company
Blue Sky Analytics is a Climate Tech startup that combines the power of AI & satellite data to aid in the creation of a global environmental data stack. Our funders include Beenext and Rainmatter. Over the next 12 months, we aim to expand to 10 environmental datasets spanning water, land, heat, and more!
We are looking for a data scientist to join our growing team. This position will require you to think and act on the geospatial architecture and data needs (specifically geospatial data) of the company. This position is strategic and will also require you to collaborate closely with data engineers, data scientists, software developers, and even colleagues from other business functions. Come save the planet with us!
Your Role
Manage: It goes without saying that you will be handling large amounts of image and location datasets. You will develop dataframes and automated pipelines of data from multiple sources. You are expected to know how to visualize them and use machine learning algorithms to be able to make predictions. You will be working across teams to get the job done.
Analyze: You will curate and analyze vast amounts of geospatial datasets like satellite imagery, elevation data, meteorological datasets, OpenStreetMap, demographic data, socio-econometric data, and topography to extract useful insights about the events happening on our planet.
Develop: You will be required to develop processes and tools to monitor and analyze data and its accuracy. You will develop innovative algorithms which will be useful in tracking global environmental problems like depleting water levels, illegal tree logging, and even tracking of oil-spills.
Demonstrate: A familiarity with geospatial libraries such as GDAL/Rasterio for reading/writing data, and with QGIS for making visualizations. This also extends to using advanced statistical techniques, applying concepts like regression and properties of distributions, and conducting other statistical tests.
Produce: With all the hard work being put into data creation and management, it has to be used! You will be able to produce maps showing (but not limited to) spatial distribution of various kinds of data, including emission statistics and pollution hotspots. In addition, you will produce reports that contain maps, visualizations and other resources developed over the course of managing these datasets.
Requirements
These are must have skill-sets that we are looking for:
- Excellent coding skills in Python (including deep familiarity with NumPy, SciPy, pandas).
- Significant experience with git, GitHub, SQL, AWS (S3 and EC2).
- Experience with GIS and familiarity with geospatial libraries such as GDAL and rasterio to read/write data, a GIS application such as QGIS for visualisation and query, and basic machine learning algorithms to make predictions.
- Demonstrable experience implementing efficient neural network models and deploying them in a production environment.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
- Capable of writing clear and lucid reports and demystifying data for the rest of us.
- Be curious and care about the planet!
- Minimum 2 years of demonstrable industry experience working with large and noisy datasets.
Benefits
- Work from anywhere: Work by the beach or from the mountains.
- Open source at heart: We are building a community where you can use, contribute to, and collaborate on open-source projects.
- Own a slice of the pie: Possibility of becoming an owner by investing in ESOPs.
- Flexible timings: Fit your work around your lifestyle.
- Comprehensive health cover: Health cover for you and your dependents to keep you tension free.
- Work Machine of choice: Buy a device and own it after completing a year at BSA.
- Quarterly Retreats: Yes, there's work, but then there's also the non-work, fun aspect, aka the retreat!
- Yearly vacations: Take time off to rest and get ready for the next big assignment by using your paid leave.
Company Details :
Company Name: i-Lanam Technologies
i-Lanam was established in 2020 with a vision to provide a competitive approach to application development, delivering the best quality applications at the lowest prices. Seeing the need for a company that understands ownership of the application output, a group of quality developers came together to establish it.
Canada Office
182 Central Ave, 3, London, Ontario N6A 1M7, CA
Ahmedabad Location
417 SHIVALIK SHILP, Iscon Cross Road, 4th Floor, Ahmedabad, Gujarat 380015, IN
HTML Developer Responsibilities:
- Meeting with Web designers to discuss project design and layout.
- Coding the entire HTML site from end to end.
- Debugging code and front-end web applications.
- Ensuring cross-platform compatibility.
- Troubleshooting application errors.
- Conducting website performance and usability tests.
- Meeting publication deadlines.
- Providing user support.
HTML Developer Requirements:
- Bachelor's degree in computer science, computer engineering, MIS, or similar.
- At least 1 year's experience as an HTML developer.
- In-depth knowledge of front-end coding languages including HTML, CSS, JavaScript, and XML.
- Ability to troubleshoot coding and application errors.
- Knowledge of web design and user application requirements.
- Ability to meet strict publication deadlines.
- Excellent communication and interpersonal skills.
- Strong attention to detail.