We are looking for a skilled Senior/Lead Big Data Engineer to join our team. The role is part of the research and development team, where your enthusiasm and knowledge will make you our technical evangelist for the development of our inspection technology and products.
At Elop we are developing product lines for sustainable infrastructure management using our own patented ultrasound scanner technology, combining it with other data sources to provide a holistic overview of concrete structures. At Elop we will provide you with world-class colleagues who are highly motivated to position the company as the international standard in structural health monitoring. With the right character, you will be professionally challenged and developed.
This position requires travel to Norway.
Elop is a sister company of Simplifai, and the two are co-located in all geographic locations.
Roles and Responsibilities
- Define the technical scope and objectives through research and participation in requirements gathering and process definition
- Ingest and process data from data sources (the Elop scanner) in raw format into the Big Data ecosystem
- Process real-time data feeds using the Big Data ecosystem
- Design, review, implement and optimize data transformation processes in the Big Data ecosystem
- Test and prototype new data integration/processing tools, techniques and methodologies
- Convert MATLAB code into Python/C/C++
- Participate in overall test planning for application integrations, functional areas and projects
- Work with cross-functional teams in an Agile/Scrum environment to ensure a quality product is delivered
Desired Candidate Profile
- Bachelor's degree in Statistics, Computer Science or an equivalent field
- 7+ years of experience in the Big Data ecosystem, especially Spark, Kafka, Hadoop and HBase
- 7+ years of hands-on experience in Python/Scala is a must
- Experience architecting big data applications is needed
- Excellent analytical and problem-solving skills
- Strong understanding of data analytics and data visualization; must be able to help the development team with data visualization
- Experience with signal processing is a plus
- Experience working on client-server architecture is a plus
- Knowledge of database technologies such as RDBMS, graph DBs, document DBs, Apache Cassandra and OpenTSDB
- Good communication skills, written and oral, in English
We can Offer
- An everyday life with exciting and challenging tasks developing socially beneficial solutions
- Being part of the company's research and development team, creating unique and innovative products
- Colleagues with world-class expertise, and an organization with the ambition and motivation to position the company as an international player in maintenance support and monitoring of critical infrastructure
- A good working environment with skilled and committed colleagues, and an organization with short decision paths
- Professional challenges and development
About Simplifai Cognitive Solutions Pvt Ltd
The growth of artificial intelligence accelerated these ideas: machine learning made it possible for projects to get smaller, solutions smarter, and automation more efficient. Bård and Erik wanted to bring AI to the people, and they wanted to do it simply.
Simplifai was founded in 2017 and has grown considerably since then. Today we work globally and have offices in Norway, India, and Ukraine. We have built a global, diverse organization that is well prepared for further growth.
The Platform Data Science team works at the intersection of data science and engineering. Domain experts develop and advance platforms, including the data platform, the machine learning platform, and platforms for Forecasting, Experimentation, Anomaly Detection, Conversational AI, Underwriting of Risk, Portfolio Management, Fraud Detection & Prevention, and many more. We are also the Data Science and Analytics partners for Product and provide Behavioural Science insights across Jupiter.
About the role:
We’re looking for strong Software Engineers who can combine EMR, Redshift, Hadoop, Spark, Kafka, Elasticsearch, TensorFlow, PyTorch and other technologies to build the next-generation Data Platform, ML Platform and Experimentation Platform. If this sounds interesting, we’d love to hear from you!
This role will involve designing and developing software products that impact many areas of our business. The individual in this role will help define requirements, create software designs, implement code to those specifications, provide thorough unit and integration testing, and support products while they are deployed and used by our stakeholders.
Participate in, own and influence the architecting and design of systems
Collaborate with other engineers, data scientists, product managers
Build intelligent systems that drive decisions
Build systems that enable us to perform experiments and iterate quickly
Build platforms that enable scientists to train, deploy and monitor models at scale
Build analytical systems that drive better decision making
Programming experience with at least one modern language such as Java or Scala, including object-oriented design
Experience contributing to the architecture and design (architecture, design patterns, reliability and scaling) of new and existing systems
Bachelor’s degree in Computer Science or related field
Computer Science fundamentals in object-oriented design
Computer Science fundamentals in data structures
Computer Science fundamentals in algorithm design, problem solving, and complexity analysis
Experience in databases, analytics, big data systems or business intelligence products:
Data lake, data warehouse, ETL, ML platform
Big data technologies such as Hadoop and Apache Spark
Skills: Machine Learning, Deep Learning, Artificial Intelligence, Python.
Domain knowledge: Data cleaning, modelling, analytics, statistics, machine learning, AI
· Be part of Digital Manufacturing and Industrie 4.0 projects across the Saint-Gobain group of companies
· Design and develop AI/ML models to be deployed across SG factories
· Knowledge of Hadoop, Apache Spark, MapReduce, Scala, Python programming, and SQL and NoSQL databases is required
· Should be strong in statistics, data analysis, data modelling, machine learning techniques and neural networks
· Prior experience developing AI and ML models is required
· Experience with data from the manufacturing industry would be a plus
Roles and Responsibilities:
· Develop AI and ML models for the Manufacturing Industry with a focus on Energy, Asset Performance Optimization and Logistics
· Multitasking and good communication skills are necessary
· Entrepreneurial attitude.
- Highly skilled and proficient in Azure data engineering tech stacks (ADF, Databricks)
- Should be well experienced in the design and development of big data integration platforms (Kafka, Hadoop)
- Highly skilled and experienced in building medium to complex data integration pipelines for data at rest and streaming data using Spark
- Strong knowledge of R/Python
- Advanced proficiency in solution design and implementation with Azure Data Lake, SQL and NoSQL databases
- Strong in data warehousing concepts
- Expertise in SQL, SQL tuning, data management (data security), schema design, Python and ETL processes
- Highly motivated self-starter and quick learner
- Must have good knowledge of data modelling and an understanding of data analytics
- Exposure to statistical procedures, experiments and machine learning techniques is an added advantage
- Experience leading a small team of 6-7 data engineers
- Excellent written and verbal communication skills
This requirement is to serve our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Your work will have a direct impact on driving business strategies for prominent industry leaders. Self-motivation and strong communication skills are both must-haves. The ability to work in a fast-paced environment is desired.
Problems being solved by our client:
Measure consumer usage of devices linked to the internet and home networks, including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses measurement technology to unravel the dynamics of consumers’ interactions with multiple devices.
Duties and responsibilities:
- The successful candidate will contribute to the development of novel audience measurement and demographic inference solutions.
- Develop, implement, and support statistical or machine learning methodologies and processes.
- Build and test new features and concepts and integrate them into the production process
- Participate in ongoing research and evaluation of new technologies
- Apply your experience across the development lifecycle through analysis, design, development, testing and deployment of this system
- Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely, quality data. You will be the knowledge expert, delivering quality data to our clients
- 3-5 years of relevant work experience in the areas outlined below
- Experience extracting data from large databases using SQL
- Experience writing complex ETL processes and frameworks for analytics and data management. Must have experience working with ETL tools.
- Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered.
- Programming experience in a scientific computing language (R, Python, Julia) and the ability to work with relational data (SQL, Apache Pig, Spark SQL). General-purpose programming (Python, Scala, Java) and familiarity with Hadoop are a plus.
- Excellent verbal and written communication skills.
- Experience with TV or digital audience measurement or market research data is a plus.
- Familiarity with systems analysis or systems thinking is a plus.
- Must be comfortable analyzing complex, high-volume, high-dimensional data from varying sources
- Excellent verbal, written and computer communication skills
- Ability to engage with Senior Leaders across all functional departments
- Ability to take on new responsibilities and adapt to changes
- Design and develop strong analytics systems and predictive models
- Manage a team of data scientists, machine learning engineers, and big data specialists
- Identify valuable data sources and automate data collection processes
- Undertake pre-processing of structured and unstructured data
- Analyze large amounts of information to discover trends and patterns
- Build predictive models and machine-learning algorithms
- Combine models through ensemble modeling
- Present information using data visualization techniques
- Propose solutions and strategies to business challenges
- Collaborate with engineering and product development teams
- Proven experience as a Data Scientist
- Solid experience with data mining processes
- Understanding of machine learning; knowledge of operations research is a plus
- Strong understanding of and experience with R, SQL, and Python; knowledge of Scala, Java, or C++ is an asset
- Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
- Strong math skills (e.g. statistics, algebra)
- Problem-solving aptitude
- Excellent communication and presentation skills
- Experience in Natural Language Processing (NLP)
- Strong competitive coding skills
- BSc/BA in Computer Science, Engineering or relevant field; graduate degree in Data Science or other quantitative field is preferred
About the role
- Collaborating with a team of like-minded and experienced engineers for Tier 1 customers, you will focus on data engineering on large complex data projects. Your work will have an impact on platforms that handle crores of customers and millions of transactions daily.
- As an engineer, you will use the latest cloud services to design and develop reusable core components and frameworks to modernise data integrations in a cloud-first world, and you will own those integrations end to end, working closely with business units. You will design and build for efficiency, reliability, security and scalability. As a consultant, you will help drive a data engineering culture and advocate best practices.
- 1-6 years of relevant experience
- Strong SQL skills and data literacy
- Hands-on experience designing and developing data integrations, either in ETL tools, cloud native tools or in custom software
- Proficiency in scripting and automation (e.g. PowerShell, Bash, Python)
- Experience in an enterprise data environment
- Strong communication skills
- Ability to work on data architecture, data models, data migration, integration and pipelines
- Ability to work on data platform modernisation from on-premise to cloud-native
- Proficiency in data security best practices
- Stakeholder management experience
- Positive attitude with the flexibility and ability to adapt to an ever-changing technology landscape
- Desire to gain breadth and depth of technologies to support customer's vision and project objectives
What to expect if you join Servian?
- Learning & Development: We invest heavily in our consultants, offer weekly internal training (technical and non-technical alike!) and abide by a ‘You Pass, We Pay’ policy.
- Career progression: We take a longer-term view of every hire. We have a flat org structure and promote from within. Every hire is developed as a future leader and client adviser.
- Variety of projects: As a consultant, you will have the opportunity to work on multiple projects across our client base, significantly increasing your skills and exposure in the industry.
- Great culture: Working on the latest Apple MacBook Pro in our custom-designed offices in the heart of leafy Jayanagar, we provide a peaceful and productive work environment close to shops, parks and the metro station.
- Professional development: We invest heavily in professional development both technically, through training and guided certification pathways, and in consulting, through workshops in client engagement and communication. Growth in our organisation happens from the growth of our people.
upGrad is an online education platform building the careers of tomorrow by offering the most industry-relevant programs in an immersive learning experience. Our mission is to create a new digital-first learning experience that delivers tangible career impact to individuals at scale. upGrad currently offers programs in Data Science, Machine Learning, Product Management, Digital Marketing, Entrepreneurship, and more. upGrad is looking for people passionate about management and education to help design learning programs that keep working professionals sharp and relevant, and to help build the careers of tomorrow.