Datametica is looking for talented SQL engineers who will receive training and the opportunity to work on Cloud and Big Data analytics.
Mandatory Skills:
- Strong SQL development skills
- Hands-on experience with at least one scripting language, preferably shell scripting
- Development experience on data warehouse projects
Opportunities:
- Selected candidates will be given training opportunities on one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies such as Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
- Will get a chance to be part of enterprise-grade implementations of Cloud and Big Data systems
- Will play an active role in setting up modern data platforms based on Cloud and Big Data
- Will be part of teams with rich experience in various aspects of distributed systems and computing
● Able to contribute to gathering functional requirements, developing technical specifications, and test case planning
● Demonstrating technical expertise and solving challenging programming and design problems
● 60% hands-on coding, with architecture ownership of one or more products
● Ability to articulate architectural and design options, and to educate development teams and business users
● Resolve defects/bugs during QA testing, pre-production, production, and post-release patches
● Mentor and guide team members
● Work cross-functionally with various Bidgely teams, including product management, QA/QE, various product lines, and/or business units, to drive results forward
Requirements
● BS/MS in computer science or equivalent work experience
● 8-12 years' experience designing and developing applications in data engineering
● Hands-on experience with Big Data ecosystems
● Past experience with Hadoop, HDFS, MapReduce, YARN, AWS Cloud, EMR, S3, Spark, Cassandra, Kafka, and ZooKeeper
● Expertise with any of the following object-oriented languages: Java/J2EE, Scala, Python
● Ability to lead and mentor technical team members
● Expertise with the entire Software Development Life Cycle (SDLC)
● Excellent communication skills: demonstrated ability to explain complex technical issues to both technical and non-technical audiences
● Expertise in the software design/architecture process
● Expertise with unit testing and Test-Driven Development (TDD)
● Business acumen: strategic thinking and strategy development
● Experience with Cloud or AWS is preferable
● Good understanding of, and ability to develop, software, prototypes, or proofs of concept (POCs) for various data engineering requirements
● Experience with Agile development, Scrum, or Extreme Programming methodologies
We are looking for a technically driven Full-Stack Engineer for one of our premium clients.
Qualifications
• Bachelor's degree in computer science or a related field; Master's degree is a plus
• 3+ years of relevant work experience
• Meaningful experience with at least two of the following technologies: Python, Scala, Java
• Strong, proven experience with distributed processing frameworks (Spark, Hadoop, EMR) and SQL is expected
• Commercial client-facing project experience is helpful, including working in close-knit teams
• Ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets
• Proven ability to clearly communicate complex solutions
• Understanding of information security principles to ensure compliant handling and management of client data
• Experience and interest in cloud platforms such as AWS, Azure, Google Cloud Platform, or Databricks
• Extraordinary attention to detail
Our client is an innovative Fintech company that is revolutionizing the business of short-term finance. The company is an online lending startup driven by an app-enabled technology platform, solving the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers.
- Performing extensive analysis using SQL, Google Analytics, and Excel from a product standpoint to provide quick recommendations to management
- Establishing scalable, efficient and automated processes to deploy data analytics on large data sets across platforms
What you need to have:
- B.Tech/B.E. or any graduate degree
- Strong background in statistical concepts & calculations to perform analysis/ modeling
- Proficient in SQL and in BI tools such as Tableau and Power BI
- Good knowledge of Google Analytics and any other web analytics platforms (preferred)
- Strong analytical and problem-solving skills for analyzing large datasets
- Ability to work independently and bring innovative solutions to the team
- Experience of working with a start-up or a product organization (preferred)
As part of a team focused on preserving the customer experience across the organization, this Analytic Consultant will be responsible for:
- Understanding business objectives and providing credible challenge to analysis requirements.
- Verifying that sound analysis practices and data decisions were leveraged throughout the planning and data sourcing phases.
- Conducting in-depth research within complex data environments to identify data integrity issues and propose solutions to improve analysis accuracy.
- Applying critical evaluation to challenge assumptions, formulate a defensible hypothesis, and ensure high-quality analysis results.
- Ensuring adherence to data management/data governance regulations and policies.
- Performing and testing highly complex data analytics for customer remediation.
- Designing analysis project flows and documentation that are structured for consistency, easy to understand, and suitable for multiple levels of reviewers, partners, and regulatory agents, demonstrating the research and analysis completed.
- Investigating and ensuring data integrity from multiple sources.
- Ensuring the data recommended and used is the best "source of truth".
- Applying knowledge of business, customers, and products to synthesize data into a coherent story and to compare and contrast information against industry perspective. The data involved is typically very large, structured or unstructured, and drawn from multiple sources.
- Maintaining strong attention to detail and consistently meeting high quality standards.
- Performing other duties as assigned by the manager.
- Assisting with high-priority work outside regular business hours or on weekends as needed.
Essential Qualifications:
- 5+ years in similar analytics roles
- Bachelor's degree or higher (M.A./M.Sc.) in applied mathematics, statistics, engineering, physics, accounting, finance, economics, econometrics, computer science, or business/social and behavioral sciences with a quantitative emphasis.
- Programming knowledge of SQL/SAS preferred
- Knowledge of PVSI, Non-Lending, Student Loans, Small Business and Personal Lines and Loans is a plus.
- Strong experience with data integration, database structures and data warehouses.
- Persuasive written and verbal communication skills.
Desired Qualifications:
- Certifications in Data Science, or BI Reporting tools.
- Ability to prioritize work, meet deadlines, achieve goals, and work under pressure in a dynamic and complex environment.
- Detail oriented, results driven, and has the ability to navigate in a quickly changing and high demand environment while balancing multiple priorities.
- Ability to research and report on a variety of issues using problem solving skills.
- Ability to act with integrity and a high level of professionalism with all levels of team members and management.
- Ability to make timely and independent judgment decisions while working in a fast-paced and results-driven environment.
- Ability to learn the business aspects quickly, multitask and prioritize between projects.
- Exhibits appropriate sense of urgency in managing responsibilities.
- Ability to accurately process high volumes of work within established deadlines.
- Available to flex schedule periodically based on business need.
- Demonstrate strong negotiation, communication & presentation skills.
- Demonstrates a high degree of reliability, integrity and trustworthiness.
- Takes ownership of assignments and helps drive assignments of the team.
- Dedicated, enthusiastic, driven, and performance-oriented; possesses a strong work ethic and is a good team player.
- Be proactive and get engaged in organizational initiatives.
Technical/Core skills:
- Minimum 3 years' experience with Informatica Big Data Management (BDM) in a Hadoop environment.
- Knowledge of Informatica PowerExchange (PWX).
- Minimum 3 years' experience with big data querying tools such as Hive and Impala.
- Ability to design and develop complex mappings using Informatica BDM.
- Ability to create and manage Informatica PowerExchange and CDC real-time implementations.
- Strong Unix skills for writing shell scripts and troubleshooting existing scripts.
- Good knowledge of big data platforms and their frameworks.
- Experience with Cloudera Data Platform (CDP) is a plus.
- Experience building stream processing systems using Kafka and Spark.
- Excellent SQL knowledge.
Soft skills:
- Ability to work independently
- Strong analytical and problem solving skills
- Eagerness to learn new technologies
- Comfortable with regular interaction with vendors, partners, and stakeholders
• Responsible for developing and maintaining applications with PySpark

If you are an outstanding ETL Developer with a passion for technology and are looking forward to being part of a great development organization, we would love to hear from you. We offer technology consultancy services to our Fortune 500 customers, with a primary focus on digital technologies. Our customers are looking for top-tier talent in the industry and are willing to compensate based on your skill and expertise. The nature of our engagement is contract in most cases. If you are looking for the next big step in your career, we are glad to partner with you.

Below is the job description for your review.

Must Have Skills:
- Extensive hands-on experience in designing and developing ETL packages using SSIS
- Extensive experience in performance tuning of SSIS packages
- In-depth knowledge of data warehousing concepts and ETL systems, and of relational databases such as SQL Server 2012/2014
JD:
Required Skills:
- Intermediate-to-expert hands-on programming in one of the following languages: Java, Python, PySpark, or Scala.
- Strong practical knowledge of SQL.
- Hands-on experience with Spark/Spark SQL.
- Solid grasp of data structures and algorithms.
- Hands-on experience as an individual contributor in the design, development, testing, and deployment of applications based on Big Data technologies.
- Experience with Big Data tools such as Hadoop, MapReduce, Spark, etc.
- Experience with NoSQL databases such as HBase.
- Experience with Linux environments (shell scripting, AWK, sed).
- Intermediate RDBMS skills; able to write SQL queries with complex joins against a large schema (100+ tables).
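To make the last bullet concrete, the kind of multi-table query it describes can be sketched with Python's built-in sqlite3 module. The schema and data below are invented purely for illustration (a tiny order-management example), not taken from any real system:

```python
import sqlite3

# In-memory database with a small, hypothetical order-management schema
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders    (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    CREATE TABLE payments  (id INTEGER PRIMARY KEY, order_id INTEGER, amount REAL);

    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders    VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
    INSERT INTO payments  VALUES (100, 10, 250.0), (101, 12, 75.0);
""")

# A query joining several relations with aggregation: per-customer
# order totals and how much of each total has been paid so far.
rows = conn.execute("""
    SELECT c.name,
           SUM(o.total)                AS ordered,
           COALESCE(SUM(p.amount), 0)  AS paid
    FROM customers c
    JOIN orders o        ON o.customer_id = c.id
    LEFT JOIN payments p ON p.order_id = o.id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()

for name, ordered, paid in rows:
    print(f"{name}: ordered={ordered}, paid={paid}")
# → Acme: ordered=350.0, paid=250.0
# → Globex: ordered=75.0, paid=75.0
```

A real 100+-table schema adds scale, not new concepts: the same join, LEFT JOIN, and GROUP BY building blocks apply.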