Who are we?
We are incubators of high-quality, dedicated software engineering teams for our clients. We work with product organizations to help them scale or modernize their legacy technology solutions, and with startups to help them operationalize their ideas efficiently. Incubyte strives to find people who are passionate about coding, learning, and growing along with us. We work with a limited number of clients at a time on dedicated, long-term commitments, with the aim of bringing a product mindset into services.
What we are looking for
We're looking to hire software craftspeople: people who are proud of the way they work and the code they write, who believe in and evangelize extreme programming principles, and who are the high-quality, motivated, passionate people who make great teams. We believe strongly in being a DevOps organization, where developers own the entire release cycle and thus get to work not only with programming languages but also with infrastructure technologies in the cloud.
What you'll be doing
First, you will be writing tests. You'll be writing self-explanatory, clean code that produces the same, predictable results over and over again. You'll be making frequent, small releases, working in pairs, and doing peer code reviews. You will work in a product team, building products and rapidly rolling out new features and fixes. You will be responsible for all aspects of development: understanding requirements, writing stories, analyzing the technical approach, writing test cases, development, deployment, and fixes. You will own the entire stack, from the front end to the back end to the infrastructure and DevOps pipelines. And, most importantly, you'll be making a pledge never to stop learning!
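The "write tests first" practice named above is test-driven development. A minimal sketch of the red-green cycle, assuming a hypothetical `word_count` function invented purely for illustration:

```python
def word_count(text: str) -> int:
    # In TDD, this implementation is written only after the tests below
    # already exist (and initially fail).
    return len(text.split())


def test_counts_words():
    assert word_count("clean simple code") == 3


def test_empty_string():
    assert word_count("") == 0


# Run the tests; in practice a runner like pytest would collect these.
test_counts_words()
test_empty_string()
```

The point of the cycle is that each new behavior is specified by a failing test before any production code is written.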
Skills you need in order to succeed in this role
Most important: integrity of character, diligence, and the commitment to do your best.
Technologies:
- Azure Data Factory
- MongoDB
- SSIS / Apache NiFi (good to have)
- Python / Java
- SOAP / REST web services
- Stored procedures and SQL
- Test-driven development
Experience with:
- Data warehousing and data lake initiatives on the Azure cloud
- Cloud DevOps solutions and cloud data and application migration
- Database concepts and optimization of complex queries
- Database versioning, backups, restores, migration, and automation of the same
- Data security and integrity
SpringML is looking to hire a top-notch Senior Data Engineer who is passionate about working with data and using the latest distributed frameworks to process large datasets. Your primary role will be to design and build data pipelines, focusing on client projects involving data integration, data preparation, and implementing machine learning on datasets. In this role, you will work with some of the latest technologies, collaborate with partners on early wins, take a consultative approach with clients, interact daily with executive leadership, and help build a great company. Chosen team members will be part of the core team and play a critical role in scaling up our emerging practice.
RESPONSIBILITIES:
- Work as a member of a team assigned to design and implement data integration solutions.
- Build data pipelines using standard frameworks in Hadoop, Apache Beam, and other open-source solutions.
- Learn quickly: understand and rapidly comprehend new areas, both functional and technical, and apply detailed and critical thinking to customer solutions.
- Propose design solutions and recommend best practices for large-scale data analysis.
SKILLS:
- B.Tech degree in computer science, mathematics, or another relevant field.
- 4+ years of experience in ETL, data warehousing, visualization, and building data pipelines.
- Strong programming skills: experience and expertise in one of Java, Python, Scala, or C.
- Proficiency in big data / distributed computing frameworks such as Apache Spark and Kafka.
- Experience with Agile implementation methodologies.
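The data pipelines described above follow an extract-transform-load shape, whatever framework runs them. A pure-Python sketch of that shape (the `id,amount` record schema and the filtering rule are invented for illustration; frameworks like Apache Beam express the same stages as composable transforms):

```python
from typing import Iterable, Iterator


def extract(rows: Iterable[str]) -> Iterator[dict]:
    # Parse raw CSV-like lines into records (hypothetical schema: id,amount).
    for line in rows:
        ident, amount = line.split(",")
        yield {"id": ident, "amount": float(amount)}


def transform(records: Iterable[dict]) -> Iterator[dict]:
    # Keep only positive amounts and derive an integer-cents field.
    for rec in records:
        if rec["amount"] > 0:
            rec["amount_cents"] = int(round(rec["amount"] * 100))
            yield rec


def load(records: Iterable[dict]) -> list[dict]:
    # Stand-in for writing to a warehouse table.
    return list(records)


raw = ["a1,10.50", "a2,-3.00", "a3,2.25"]
result = load(transform(extract(raw)))
```

Because each stage is a generator, records stream through one at a time, which is the same dataflow idea that distributed frameworks scale out across workers.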
As an MLOps Engineer, you will work collaboratively with Data Scientists and Data Engineers to deploy and operate systems. You will help automate and streamline our operations and processes, build and maintain tools for deployment, monitoring, and operations, and troubleshoot and resolve issues in development, testing, and production environments.
Responsibilities:
• Operate and maintain systems supporting the provisioning of new clients, applications, and features.
• Day-to-day monitoring of the production service delivery environment to ensure all services and applications are operating optimally and SLAs are met.
• Software deployment and configuration management in both QA and production environments.
• Collaborate with Data Scientists and Data Engineers on feature development teams to containerize and build out deployment pipelines for new modules.
• Design, build, and optimize application containerization and orchestration with Docker, Kubernetes, and AWS or Azure.
• Automate application and infrastructure deployments.
• Produce build and deployment automation scripts to integrate between services.
• Be a subject matter expert on DevOps practices, CI/CD, and configuration management with the assigned engineering team.
• Experience with one of the cloud computing platforms: Google Cloud, Amazon Web Services, Azure, Kubernetes.
• Experience with MLflow, Kubeflow, ML tracking, and ML experiments.
• Experience with big data technologies preferred: Hadoop, Hive, Spark, Kafka.
• Knowledge of machine learning frameworks: TensorFlow, Caffe/Caffe2, PyTorch, Keras, MXNet, scikit-learn.
Skills:
• At least 3 years of experience working with cloud-based services and DevOps concepts, tools, and practices.
• Extensive experience with Unix/AIX/Linux environments.
• Experience with Kubernetes or Docker Swarm.
• Experience working in cross-functional Agile engineering teams.
• Familiarity with standard concepts and technologies used in CI/CD build and deployment pipelines.
• Experience with scripting and coding using Python and shell.
• Experience with configuration using tools such as Chef and Ansible.
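The monitoring duty above boils down to checking service measurements against an SLA target. A minimal sketch, assuming a hypothetical 99.9% uptime target and invented health-check samples:

```python
def sla_met(checks: list[bool], target: float = 0.999) -> bool:
    """Return True if the fraction of successful health checks meets the SLA target."""
    if not checks:
        # No data: treat as a breach rather than silently passing.
        return False
    return sum(checks) / len(checks) >= target


# 1,000 health checks with exactly one failure: 99.9% uptime, right on target.
samples = [True] * 999 + [False]
print(sla_met(samples))  # True
```

Real monitoring stacks compute this over time windows and alert on breaches; the arithmetic is the same.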
Job title: Talend Developer
Summary: This Talend Developer will design and develop new functional components, work with other developers, and be involved in all phases of the software development life cycle.
Location: Pune, India
Education: Degree in Computer Science
Development experience: Minimum 3 years
Is this you?
● I am passionate about designing and developing efficient, high-quality software.
● I have a keen eye for detail and love solving problems.
● I strive to work with a diverse, highly skilled team based in the UK and India.
● I am fluent in English, both written and spoken.
Responsibilities
● Design, develop, deploy, and maintain solutions on premise or in the cloud.
● Integrate software components with third-party systems.
● Troubleshoot, debug, and maintain existing software.
● Recommend and execute improvements to solutions and processes.
● Create technical documentation for reference and reporting.
Qualifications
● Experience developing with the Talend product suite.
● Experience with Java 8+, SQL, NoSQL, JSON, XML, XPath, and regular expressions.
● Experience with REST API development.
● Experience with design patterns and the software development life cycle.
● Experience with ETL, ESB, MDM, data quality, and data profiling.
● Experience designing logical and physical data models for relational and hierarchical data structures.
● Knowledge of ActiveMQ or another messaging framework.
● Knowledge of cloud platforms and services on AWS, Azure, or GCP.
● Knowledge of automated unit testing and integration testing.
● Knowledge of search frameworks (e.g. Elasticsearch, Solr, Lucene).
● Knowledge of Apache Camel and Apache Spark.
Competencies
● Excellent written and verbal communication skills in English and Hindi.
● Excellent interpersonal skills to collaborate with various stakeholders.
● Identifying the right questions and understanding the big picture.
● A quick learner who enjoys new challenges.
● A proactive self-starter with excellent time management skills.
Benefits
● Excellent work-life balance, including flexible working hours within core working hours.
● Active involvement in decision making at all levels.
● An assigned mentor for self-development.
● 18 days of annual leave.
● Medical insurance and Provident Fund.
We want to know more about you…
Please prepare your CV using the Europass format:
● https://europa.eu/europass/eportfolio/screen/cv-editor?lang=en
Please answer the following questions:
1. Why should Onepoint consider you for an interview?
2. Why would you like to work at Onepoint?
3. Which virtue do you value the most at work, and why?
4. How do you keep up to date with technology that is constantly changing?
5. Where do you want to be in 2 to 3 years in terms of your career development?
6. What is your current salary, and what is your yearly salary expectation?
7. What is your notice period?
Onepoint was established in 2005 in London to specialise in enterprise architecture consulting and open-source solutions. Today we equip our clients to achieve transformational business outcomes powered by digital advances by applying world-class technology, data, and analytics expertise. We are proud to be called trusted partners by our clients. We love to provide value in whatever we do.
For further information, consult: www.onepointltd.com
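The XML and XPath skills listed above can be illustrated with Python's standard library, which supports a limited XPath subset via `ElementTree` (the invoice document below is invented purely for illustration):

```python
import xml.etree.ElementTree as ET

doc = """
<invoices>
  <invoice id="1"><total>10.00</total></invoice>
  <invoice id="2"><total>24.50</total></invoice>
</invoices>
"""

root = ET.fromstring(doc)
# Limited XPath: select every <total> that is a child of an <invoice>.
totals = [float(t.text) for t in root.findall("./invoice/total")]
print(totals)  # [10.0, 24.5]
```

Talend and full XPath engines support a much richer query language (predicates, axes, functions); this shows only the basic path-selection idea.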
Job Description:
• Help build a Data Science team that will be engaged in researching, designing, implementing, and deploying full-stack, scalable data analytics and machine learning solutions to address various business issues.
• Model complex algorithms, discover insights, and identify business opportunities through the use of algorithmic, statistical, visualization, and mining techniques.
• Translate business requirements into quick prototypes and enable the development of big data capabilities driving business outcomes.
• Be responsible for data governance and for defining data collection and collation guidelines.
• Advise, guide, and train junior data engineers in their jobs.
Must have:
• 4+ years of experience in a leadership role as a Data Scientist.
• Preferably from the retail, manufacturing, or healthcare industry (not mandatory).
• Willing to start from scratch and build up a team of Data Scientists.
• Open to taking up challenges with end-to-end ownership.
• Confident, with excellent communication skills and good decision-making ability.
What you'll do:
- Develop analytics tools, working on big data in a distributed environment; scalability will be key.
- Provide architectural and technical leadership in developing our core analytics platform.
- Lead development efforts on product features in Java.
- Help scale our mobile platform as we experience massive growth.
What we need:
- Passion for building analytics and personalisation platforms at scale.
- 3 to 9 years of software engineering experience with a product-based company in the data analytics / big data domain.
- Passion for design and development from scratch.
- Expert-level Java programming and experience leading the full lifecycle of application development.
- Experience in analytics, Hadoop, Pig, Hive, MapReduce, Elasticsearch, and MongoDB is an additional advantage.
- Strong communication skills, verbal and written.
- Interpret data and analyze results using statistical techniques, and provide ongoing reports.
- Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality.
- Acquire data from primary or secondary data sources and maintain databases/data systems.
- Identify, analyze, and interpret trends or patterns in complex data sets.
- Filter and "clean" data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems.
- Work with management to prioritize business and information needs.
- Locate and define new process improvement opportunities.
Preferred skills:
• Minimum 3 years of experience in software development.
• Strong experience in Spark Scala development.
• Strong experience with AWS cloud platform services.
• Good knowledge of and exposure to Amazon EMR and EC2.
• Good with databases such as DynamoDB and Snowflake.
As a Senior Consultant Analytics, you will gather, analyze, and identify actionable data insights to solve real, complex challenges. Your work will directly influence decisions taken to build the most successful web/mobile gaming platform, with games played by millions across the world.
Responsibilities
- Leverage complex data algorithms and numerical methods for forecasting, simulation, and predictive analysis, designed to optimize key decisions and metrics such as retention, engagement, and operations.
- Work with all teams in the company to understand, plan, and drive data-based actionable decisions, championing the Data Science culture.
- Analyze consumer data to derive insight on how to target effectively, retain customers at low cost, and optimize engagement metrics.
- Develop a reporting process for all KPIs.
- Recruit, develop, and coach talent.
- Champion awareness of data indicators within the organization and teach everyone to proactively identify data patterns, growth trends, and areas for improvement.
Qualifications
- B.Tech or bachelor's degree in Data Science; an MBA or advanced degree in a quantitative discipline is preferred.
- 3+ years of Data Science and analytics execution, with specific experience in all stages (gathering, hygiene, and analysis of data) resulting in actionable insights for end-to-end product, marketing, and business operations.
- Knowledge of advanced analytics, including but not limited to regression, GLMs, survival modeling, predictive analytics, forecasting, machine learning, and decision trees.
- Experience with at least one statistical and analytics tool or language, such as R, Python, or SAS.
- Expertise in analysis algorithms, numerical methods, and data tactics used to drive operational excellence, user engagement, and retention.
- Expertise in at least one visualization platform such as Tableau, QlikView, Spotfire, or Excel; Tableau skills preferred.
- Advanced SQL is a mandatory skill.
- Ability to creatively solve business problems through innovative approaches.
- Ability to work with various teams in a complex environment, ensuring timely delivery of multiple projects.
- Highly analytical, with the ability to collate, analyze, and present data, and to drive clear insights into decisions that improve KPIs.
- Ability to effectively communicate and manage relationships with senior management, company divisions, and partners.
Your role:
· As an integral part of the Data Engineering team, be involved in the entire development lifecycle, from conceptualization to architecture to coding to unit testing.
· Build real-time and batch analytics platforms for analytics and machine learning.
· Design, propose, and develop solutions keeping the growing scale and business requirements in mind.
· Help us design the data model for our data warehouse and other data engineering solutions.
Must have:
· Understands data very well and has extensive data modelling experience.
· Deep understanding of real-time as well as batch-processing big data technologies (Spark, Storm, Kafka, Flink, MapReduce, YARN, Pig, Hive, HDFS, Oozie, etc.).
· Experience developing applications that work with NoSQL stores (e.g., Elasticsearch, HBase, Cassandra, MongoDB, CouchDB).
· Proven programming experience in Java or Scala.
· Experience in gathering and processing raw data at scale, including writing scripts, web scraping, calling APIs, and writing SQL queries.
· Experience with cloud-based data stores like Redshift and BigQuery is an advantage.
Bonus:
· Love sports, especially cricket and football.
· Have worked previously in a high-growth tech startup.
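The real-time technologies listed above (Spark, Flink, Kafka consumers) all center on windowed aggregation over event streams. A pure-Python sketch of a tumbling count window, assuming an invented 60-second window and hypothetical `(timestamp, key)` click-stream events:

```python
from collections import Counter


def tumbling_counts(events: list[tuple[float, str]],
                    window: float = 60.0) -> dict[int, Counter]:
    """Group (timestamp, key) events into fixed windows and count keys per window."""
    windows: dict[int, Counter] = {}
    for ts, key in events:
        bucket = int(ts // window)  # which 60-second window this event falls in
        windows.setdefault(bucket, Counter())[key] += 1
    return windows


events = [(5.0, "click"), (42.0, "click"), (61.0, "view"), (119.0, "click")]
print(tumbling_counts(events))
```

Stream processors add the hard parts this sketch omits: out-of-order events, watermarks, and distributing the buckets across workers, but the per-window aggregation logic is the same.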