50+ Remote SQL Jobs in India
Apply to 50+ Remote SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!
A leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage. Our teams offer solutions and services at the intersection of Advanced Data, Analytics, and AI.
Experience - 4+ years of relevant experience, or hands-on use of Delphix on a project
Location - Bangalore/Mangalore/Remote
Skills – Delphix, Data Masking, ETL, SQL, Data migration
Key Skills:
• Strong knowledge of Delphix Data Platform (virtualization and masking).
• Experience in database management and data anonymization techniques.
• Familiarity with CI/CD pipelines and DevOps practices.
• Expertise in SQL and knowledge of data migration.
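For illustration, deterministic masking is one common data anonymization technique: a value is replaced by a stable pseudonym so joins and analytics keep working. A minimal sketch in Python (this is not Delphix's actual API; the salt, field names, and data are hypothetical):

```python
import hashlib

def mask_email(email: str, salt: str) -> str:
    """Deterministically pseudonymize an email address: the same input
    always yields the same masked value, preserving joins across tables."""
    digest = hashlib.sha256((salt + email).encode()).hexdigest()[:12]
    return f"user_{digest}@example.com"

# Hypothetical rows; a real masking job would run over database columns.
emails = ["alice@corp.com", "bob@corp.com", "alice@corp.com"]
print([mask_email(e, salt="s3cret") for e in emails])
# The first and third results match, so referential integrity is preserved.
```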
About the company
Adia makes clinicians better diagnosticians. Adia Health revolutionizes clinical decision support by enhancing diagnostic accuracy and personalizing care. It modernizes the diagnostic process by automating optimal lab test selection and interpretation, utilizing a combination of expert medical insights, real-world data, and artificial intelligence. This approach not only streamlines the diagnostic journey but also ensures precise, individualized patient care by integrating comprehensive medical histories and collective platform knowledge.
Position Overview
We are seeking a highly skilled Backend Engineer specializing in integrations and platform development to join our dynamic team. The ideal candidate will have a background in a complex domain and a proven track record of success. This role requires a deep understanding of backend technologies, strong problem-solving skills, and the ability to collaborate effectively with cross-functional teams.
Key Responsibilities
- Design, implement, and maintain scalable and secure integrations with third-party systems and APIs to enable seamless data exchange and functionality.
- Develop and maintain internal platform services and APIs to support various product features and business requirements.
- Collaborate with cross-functional teams to ensure smooth integration of backend services with user-facing applications.
- Work closely with product managers and stakeholders to understand integration requirements and translate them into technical solutions.
- Identify opportunities for performance optimization, scalability improvements, and system enhancements within the integration and platform infrastructure.
- Implement monitoring, logging, and alerting solutions to ensure the reliability and availability of integration services and platform components.
- Experience with HL7/FHIR is a huge plus.
Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- Proven experience (4+ years) in backend development with expertise in building integrations and platform services
- Proficiency in Node.js, JavaScript, TypeScript, MongoDB, SQL, NoSQL, AWS (or other cloud providers like GCP or Azure)
- Strong problem-solving skills and the ability to collaborate effectively with cross-functional teams in an agile environment
- Experience working in a complex domain, ideally U.S. Healthcare
- English fluency required
Who We Are 🌟
We are a company where the ‘HOW’ of building software is just as important as the ‘WHAT’. Embracing Software Craftsmanship values and eXtreme Programming practices, we create well-crafted products for our clients. We partner with large organizations to help modernize their legacy code bases, and work with startups to launch MVPs, scale, or serve as extensions of their team to efficiently operationalize their ideas. We love to work with folks who are passionate about creating exceptional software, are continuous learners, and are painstakingly fussy about quality.
Our Values 💡
• Relentless Pursuit of Quality with Pragmatism
• Extreme Ownership
• Proactive Collaboration
• Active Pursuit of Mastery
• Effective Feedback
• Client Success
What We’re Looking For 👀
We’re looking to hire software craftspeople and data engineers. People who are proud of the way they work and the code they write. People who believe in and are evangelists of extreme programming principles. High quality, motivated and passionate people who make great teams. We heavily believe in being a DevOps organization, where developers own the entire release cycle, including infrastructure technologies in the cloud.
What You’ll Be Doing 💻
- Collaborate with teams across the organization, including product managers, data engineers and business leaders, to translate requirements into software solutions to process large amounts of data.
- Develop new ways to ensure ETL and data processes are running efficiently.
- Write clean, maintainable, and reusable code that adheres to best practices and coding standards.
- Conduct thorough code reviews and provide constructive feedback to ensure high-quality codebase.
- Optimize software performance and ensure scalability and reliability.
- Stay up-to-date with the latest trends and advancements in data processing and ETL development and apply them to enhance our products.
- Meet with product owners and other stakeholders weekly to discuss priorities and project requirements.
- Ensure deployment of new code is tested thoroughly and has business sign off from stakeholders as well as senior leadership.
- Handle all incoming support requests and errors promptly, within the time frame and timezone commitments made to the business.
Location : Remote
Skills you need in order to succeed in this role
What you will bring:
- 7+ years of experience with Java 11+ (required), managing and working in Maven projects
- 2+ years of experience with Python (required)
- Knowledge and understanding of complex data pipelines utilizing ETL processes (required)
- 4+ years of experience using relational databases and deep knowledge of SQL with the ability to understand complex data relationships and transformations (required)
- Knowledge and understanding of Git (required)
- 3+ years of experience with various GCP technologies (a minimal Beam sketch follows this list):
- Google Dataflow (Apache Beam SDK), or equivalent Hadoop technologies
- BigQuery, or equivalent data warehouse technologies (Snowflake, Azure DW, Redshift)
- Cloud Storage Buckets (equivalent to S3)
- GCloud CLI
- Experience with Apache Airflow / Google Composer
- Knowledge and understanding of Docker, Linux, Shell/Bash and virtualization technologies
- Knowledge and understanding of CI/CD methodologies
- Ability to understand and build UML diagrams to showcase complex logic
- Experience with various organization/code tools such as Jira, Confluence and GitHub
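Since Dataflow jobs are authored with the Apache Beam SDK, here is a minimal word-count sketch in Python; it runs locally on the DirectRunner once apache-beam is installed, and the input data is made up:

```python
import apache_beam as beam

# Count occurrences of each word in a tiny in-memory dataset. On Dataflow,
# the same pipeline scales out by swapping the runner and I/O connectors.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(["alpha", "beta", "alpha", "gamma"])
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "CountPerKey" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```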
Bonus Points for Tech Enthusiasts:
- Infrastructure as Code technologies (Pulumi, Terraform, CloudFormation)
- Experience with observability and logging platforms (DataDog)
- Experience with DBT or similar technologies
We are seeking a Senior or Staff Software Engineer (Node.js, Azure, and React) to join our team and lead new software development initiatives.
Responsibilities:
- Contribute hands-on to coding, code reviews, architecture, and design efforts, setting a solid example for the team.
- Act as a tech lead and manage a small engineering team, fostering a collaborative and productive work environment.
- Work closely with cross-functional teams, including Product Management and Data Engineering, to build empathetic and user-centric products.
- Drive the development of robust and scalable web experiences, leveraging modern technologies and best practices.
- Provide technical guidance and mentorship to team members, promoting continuous learning and growth.
- Collaborate with stakeholders to define and prioritize engineering initiatives aligned with business goals.
- Ensure high code quality, maintainability, and performance through the implementation of best practices and coding standards.
- Foster a culture of innovation, encouraging the team to explore new technologies and approaches to problem-solving.
Requirements:
- Bachelor’s degree in Computer Science, Software Engineering, or a related field (advanced degree preferred).
- 7+ years of professional experience in software engineering, with at least 1 year in a technical leadership role.
- Strong experience in product engineering, with a focus on building empathetic and user-centric products.
- Extensive experience in web development, particularly with technologies such as Node.js and React.
- Familiarity with cloud infrastructure, specifically Azure, and containerization technologies like Docker.
- Solid understanding of software development best practices, design patterns, and coding standards.
- Excellent problem-solving and analytical skills, with the ability to make data-driven decisions.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
- Experience with Agile development methodologies (e.g., Scrum, Kanban).
Preferred Qualifications:
- Experience with web scraping techniques and tools.
- Knowledge of SQL query optimization and performance tuning.
- Familiarity with automated testing, continuous integration, and continuous deployment (CI/CD) practices.
- Experience with DevOps practices and tools (e.g., Jenkins, Ansible, Terraform).
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link: https://zrec.in/es8UJ?source=CareerSite
Explore our Career Page for more such jobs: careers.infraveo.com
We are seeking a Junior Software Engineer (AWS, Azure, Google Cloud, Spring, Node.js, Django) to join our dynamic team. As a Junior Software Engineer, you will have a passion for technology, a solid understanding of software development principles, and a desire to learn and grow in a collaborative environment. You will work closely with senior engineers to develop, test, and maintain software solutions that meet the needs of our clients and internal stakeholders.
Responsibilities:
- Software Development: Write clean, efficient, and well-documented code for various software applications.
- Testing & Debugging: Assist in testing and debugging software to ensure functionality, performance, and security.
- Learning & Development: Continuously improve your technical skills by learning new programming languages, tools, and AI methodologies.
- Documentation: Assist in the documentation of software designs, technical specifications, and user manuals.
- Problem-Solving: Identify and troubleshoot software defects and performance issues.
- Customer Communication: Interact with customers to gather requirements, provide technical support, and ensure their needs are met throughout the software development lifecycle. Maintain a professional and customer-focused attitude in all communications.
Requirements:
- Education: Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Programming Languages: Proficiency in at least one programming language such as Java, Python, TypeScript or JavaScript.
- Familiarity with: Git version control system, Scrum software development methodology, and basic understanding of databases and SQL.
- Problem-Solving Skills: Strong analytical and problem-solving skills with a keen attention to detail.
- Communication: Good verbal and written communication skills with the ability to work effectively in a team environment and interact with customers.
- Adaptability: Ability to learn new technologies and adapt to changing project requirements.
- Internship/Project Experience: Previous internship experience or project work related to software development is a plus.
Preferred Skills:
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
- Familiarity with back-end frameworks (e.g., Spring, Node.js, Django).
- Knowledge of DevOps practices and tools.
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link: https://zrec.in/F57mD?source=CareerSite
Explore our Career Page for more such jobs: careers.infraveo.com
We are seeking a Data Engineer (Snowflake, BigQuery, Redshift) to join our team. In this role, you will be responsible for developing and maintaining fault-tolerant pipelines spanning multiple database systems.
Responsibilities:
- Collaborate with engineering teams to create REST API-based pipelines for large-scale MarTech systems, optimizing for performance and reliability.
- Develop comprehensive data quality testing procedures to ensure the integrity and accuracy of data across all pipelines.
- Build scalable dbt models and configuration files, leveraging best practices for efficient data transformation and analysis.
- Partner with lead data engineers in designing scalable data models.
- Conduct thorough debugging and root cause analysis for complex data pipeline issues, implementing effective solutions and optimizations.
- Follow and adhere to the group's standards such as SLAs, code styles, and deployment processes.
- Anticipate breaking changes and implement backward-compatibility strategies for API schema changes.
- Assist the team in monitoring pipeline health via observability tools and metrics.
- Participate in refactoring efforts as platform application needs evolve over time.
Requirements:
- Bachelor's degree or higher in Computer Science, Engineering, Mathematics, or a related field.
- 3+ years of professional experience with a cloud database such as Snowflake, BigQuery, or Redshift.
- 1+ years of professional experience with dbt (Cloud or Core).
- Exposure to various data processing technologies such as OLAP and OLTP and their applications in real-world scenarios.
- Experience working cross-functionally with other teams such as Product, Customer Success, and Platform Engineering.
- Familiarity with orchestration tools such as Dagster/Airflow.
- Familiarity with ETL/ELT tools such as dltHub/Meltano/Airbyte/Fivetran and dbt.
- High intermediate to advanced SQL skills (comfort with CTEs, window functions).
- Proficiency with Python and related libraries (e.g., pandas, sqlalchemy, psycopg2) for data manipulation, analysis, and automation.
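As a concrete illustration of the SQL and Python items above, here is a sketch combining a CTE and a window function via pandas and SQLAlchemy; the connection string, table, and columns are hypothetical:

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical Postgres warehouse; swap in real credentials and tables.
engine = create_engine("postgresql+psycopg2://user:pass@localhost/warehouse")

query = """
WITH daily AS (                      -- CTE: aggregate raw events per day
    SELECT user_id, DATE(created_at) AS day, COUNT(*) AS events
    FROM events
    GROUP BY user_id, DATE(created_at)
)
SELECT user_id, day, events,
       SUM(events) OVER (            -- window function: per-user running total
           PARTITION BY user_id ORDER BY day
       ) AS running_events
FROM daily;
"""

df = pd.read_sql(query, engine)
print(df.head())
```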
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link: https://zrec.in/e9578?source=CareerSite
Explore our Career Page for more such jobs: careers.infraveo.com
We are seeking a Senior Software Engineer (.NET, HTML5, and CSS) to join our team.
Responsibilities:
- Full stack development of web applications including projects ranging between data tiers, server-side, APIs, and front-end.
- Solve moderate to complex problems with minimal guidance and support.
- Help guide the progress of projects and tickets through the use of TechDev’s project and task management systems.
- Participate in release planning, support the success of released projects.
- Propose architectural directions when involved in planning projects.
- Ensure documentation and communication needs for projects are satisfied.
- Provide research, prototyping, and product/library exploration as requested – helping the TechDev team choose the best fits for technology.
- Production of automated testing as needed (including unit tests and end-to-end testing).
- Monitor the quality and security of projects with the use of static code analysis tools such as SonarQube.
- Respond to, troubleshoot, and resolve defects and outages in WLT software. This includes being able to respond to emergencies quickly if needed.
- Mentor and provide guidance to other Developers, perform constructive code reviews.
- Learn continuously and stay up-to-date with trends, technologies and direction in the technology industry, and help surface recommendations for Tech Dev, its processes, and its projects.
- Understand and display WLT’s values.
- Other duties as assigned.
Requirements:
- Ability to produce responsive and mobile-first front-ends using modern best practices and frameworks.
- Proficiency in technologies including but not limited to: .NET, HTML5, CSS, JavaScript, Angular, Svelte, SQL and non-relational DBs.
- Ability to be pragmatic in decision-making
- Comfort with implementation and management of packages and libraries to enhance software products (e.g., Tailwind, PrimeNG, and others).
- Ability to juggle multiple priorities and respond dynamically as priorities change.
- Demonstrate a passion for learning new technologies and staying current.
- Strong time management capability, ability to estimate project scopes accurately, and adhere to timelines.
- Understands the “Big Picture” and has an entrepreneurial way of thinking.
- Detailed knowledge of various browser capabilities, technologies, and good web design practices.
- Comfortable both architecting and implementing solutions through a team.
- Understands the fundamentals behind a scalable application.
- Familiar with various design and architectural patterns.
- Fluent with modern DevOps patterns.
- Strong communication and collaboration skills.
- Ability to uphold WLT values.
Experience:
- 10+ years of hands-on experience building dynamic web applications using .NET C#, JavaScript, CSS, and Web APIs
- Experience with JavaScript front end frameworks such as Angular or Svelte
- Strong mentoring and interpersonal skills are required
- Experience with working on an agile development team
- Good understanding of databases, and of tools and techniques used for object-to-relational mapping; experience in performance tuning. Experience in technologies such as Microsoft SQL Server, SQL Azure, Entity Framework, other ORMs, and non-relational data stores.
- Experience integrating off-the-shelf solutions, understand build vs. buy decisions
- Experience with Azure DevOps, Git, and Visual Studio for task and source code management – including CI and Git branching strategies
- Experience with Microsoft Azure or similar cloud platforms
- Proficient in object-oriented design and development. Knowledge of common architectural patterns, SOLID principles, the OWASP Top Ten, and industry-accepted best practices.
- Experience with education technology and Learning Management Systems (LMSs) a plus
Education or Certification:
- Bachelor’s degree in Computer Science, Software Development, or equivalent experience.
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link: https://zrec.in/1lunY?source=CareerSite
Explore our Career Page for more such jobs: careers.infraveo.com
We are seeking an Application Developer/Software Engineer with strong technical experience in all phases of the software development life cycle (SDLC) and demonstrated technical expertise in one or more areas of state-of-the-art software development technology.
Responsibilities:
- Provides activities related to enterprise full life-cycle software development projects.
- Develops detailed functional and technical requirements for client-server and web software applications.
- Conduct detailed analyses and module-level specification development of software requirements.
- Define and implement high-performance and highly scalable product/application architectures and lead operational, tactical, and strategic integration activities.
- Perform complex programming and analysis for web and mobile applications and ETL processing; define requirements; write program specifications; design, code, test, and debug programming assignments; document programs.
- Supervise the efforts of other developers in major system development projects; determine and analyze functional requirements; determine proposed solutions information processing requirements; and optimize system performance.
- Work may include fully custom development, customization of COTS products as needed, report development, data conversion, and support of legacy applications.
Requirements:
- Can code at an intermediate or expert level in languages and technologies such as C#, ASP.NET, .NET Core, SQL, Python, Java, React, TypeScript, CSS/JavaScript, Git, Azure, Knockout, MarkLogic, Oracle, etc.
- 4+ years’ experience or specific educational background sufficient to demonstrate competency with Microsoft technology, including ASP.
- Experience with Artificial Intelligence (AI)/Machine Learning (ML), SharePoint.
- Knowledge of HTML, XHTML, XML, XSLT, .NET Framework, Visual Studio, JavaScript.
- 4+ years with Cloud technologies such as Azure / AWS / Google Cloud.
- Proficient with appropriate programming languages, particularly ASP.NET and modern web frameworks like React.
- Comfortable with Object Oriented Programming and Software Patterns.
- Excellent interpersonal skills.
- High motivation and ability to work with teams to meet project objectives.
- Ability to work on multiple projects simultaneously.
- Ability to meet project deadlines and goals without management supervision.
- Awareness of database design concepts and proficiency in a general cloud environment.
Educational Requirements:
- BS in a field related to computer science or information systems, an advanced degree, or additional specific training and/or certification in a 4th-generation computing language.
- Must be able to define and implement high-performance and highly scalable product/application architectures, and able to lead integration activities for operational, tactical, and strategic systems.
- Able to develop detailed functional and technical requirements for client-server and web software applications and conduct detailed analyses and module-level specification development of software requirements.
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link : https://zrec.in/RlUkC?source=CareerSite
Explore our Career Page for more such jobs: careers.infraveo.com
You will be working hands-on on a complex and compound product that has the potential to be used by millions of sales and marketing people around the world. You will contribute to delivering an excellent product platform that:
- enables quick iteration
- supports product customization
- and handles scale
What do we expect you to have?
- 2+ years of experience in backend engineering
- An intent to learn and an urge to build a product by learning different technologies
- Interest in writing complex, scalable, and maintainable backend applications
- Tech stack requirements:
Must haves
- Experience in building application servers in Java (Spring / Spring Boot), NodeJS, Golang, or Python
- Experience in using SQL databases and designing schemas based on application needs
- Experience with container services and runtimes (docker / docker-compose / k8s)
- Experience with cloud PaaS (AWS / GCP / Azure)
- Experience and familiarity with microservices concepts
- Experience with bash scripting
Good to have (Preferred)
- Experience with org-wide message queues (RabbitMQ / AWS SQS)
- Experience with task orchestration services (Apache Airflow / AWS Step Functions)
- Experience with infra-as-code (or system configuration) tools (Terraform / Chef / Ansible)
- Experience with build tools (make / Makefiles)
- Experience with monitoring and tracing systems for performance / system / application monitoring (Grafana + Loki + Prometheus / AWS CloudWatch)
What will you learn?
- Building a highly available, performant platform of microservices that acts as an API layer
- Industry-standard state-of-the-art tools + methodologies + frameworks + infra for building a product.
- Fable is not a trivial CRUD app. It requires a lot of consideration and care for building the API layer as the product is highly customizable per user.
- How different functions (sales, marketing, product, engineering) in a high-velocity product company work in synergy to deliver an iterative product in real life.
Who would you be working with?
- You would be directly working with the co-founder & CTO who has built multiple companies before and has built large teams in large-scale companies like ThoughtSpot, Unacademy, etc.
Position details
- Fully remote.
- 5 days/week (all public and government holidays will be non-working days).
- No specific work hours (we will sync over Zoom during the course of the day).
Thirumoolar IT Solutions is looking for a motivated and enthusiastic Fresher Trained Dataset Engineer to join our team. This entry-level position is ideal for recent graduates who are eager to apply their academic knowledge in a practical setting and contribute to the development of high-quality datasets for machine learning applications.
Responsibilities
Assist in the collection, cleaning, and preprocessing of data to ensure it is ready for training machine learning models.
Collaborate with senior dataset engineers and data scientists to understand the requirements for specific machine learning tasks.
Participate in the annotation and labeling of datasets, ensuring accuracy and consistency in data representation.
Conduct quality checks on datasets to identify and rectify errors or inconsistencies (see the sketch after this list).
Support the development of documentation and guidelines for data annotation processes.
Stay updated with the latest tools and techniques in data processing and machine learning.
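To make the annotation and quality-check duties concrete, here is a small sketch in Python; the labels, column names, and data are hypothetical:

```python
import pandas as pd

# Hypothetical annotation export: two annotators labeled the same items.
labels = pd.DataFrame({
    "item_id":     [1, 2, 3, 4],
    "annotator_a": ["cat", "dog", "cat", "bird"],
    "annotator_b": ["cat", "dog", "bird", "bird"],
})

# Agreement rate is a simple first-pass quality check on a labeled dataset.
agreement = (labels["annotator_a"] == labels["annotator_b"]).mean()
print(f"inter-annotator agreement: {agreement:.0%}")  # 75% on this sample

# Flag disagreements so they can be rectified before training.
print(labels[labels["annotator_a"] != labels["annotator_b"]])
```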
Skills and Qualifications
Bachelor’s degree in Computer Science, Data Science, Mathematics, or a related field.
Basic understanding of machine learning concepts and the importance of high-quality datasets.
Familiarity with programming languages such as Python or R is a plus.
Knowledge of data manipulation libraries (e.g., Pandas, NumPy) is advantageous.
Strong analytical skills and attention to detail.
Excellent communication and teamwork abilities.
A passion for learning and a desire to grow in the field of data engineering.
Preferred Location
Candidates based in Tamil Nadu or those willing to work from home are encouraged to apply.
We are seeking a talented and experienced ServiceNow Developer to join our dynamic team. The ideal candidate will have a strong background in developing and customizing ServiceNow applications, with a deep understanding of the platform's modules and capabilities. This role offers an exciting opportunity to work on diverse projects and collaborate with leading industry professionals.
Key Responsibilities:
- Develop and customize ServiceNow applications and modules based on business requirements.
- Create and manage ServiceNow business rules, script includes, UI actions, and other scripting elements.
- Design and implement ServiceNow integrations using REST and SOAP web services.
- Configure and customize ServiceNow forms, lists, reports, and dashboards.
- Utilize ServiceNow Flow Designer and Orchestration to automate workflows.
- Collaborate with cross-functional teams to gather requirements and deliver effective solutions.
- Troubleshoot and resolve technical issues related to ServiceNow.
- Maintain system documentation and user guides.
Requirements:
- Proficiency in ServiceNow modules, including ITSM, ITOM, ITBM, and ITAM.
- Strong knowledge of JavaScript for client-side and server-side scripting.
- Experience with ServiceNow Studio, Flow Designer, and IntegrationHub.
- Familiarity with REST and SOAP web services for system integrations.
- Understanding of relational databases and SQL.
- ServiceNow Certified System Administrator and/or ServiceNow Certified Application Developer (preferred).
- Previous experience in a ServiceNow Developer role with a track record of successful projects.
- Strong problem-solving skills and the ability to communicate technical concepts to non-technical stakeholders.
- Experience with Agile methodologies and project management principles.
Preferred Qualifications:
- Experience with other ITSM tools and platforms.
- Additional ServiceNow certifications (e.g., Certified Implementation Specialist) are a plus.
- A degree in Computer Science, Information Technology, or a related field.
About the Role
We are actively seeking talented Senior Python Developers to join our ambitious team dedicated to pushing the frontiers of AI technology. This opportunity is tailored for professionals who thrive on developing innovative solutions and who aspire to be at the forefront of AI advancements. You will work with different companies in the US who are looking to develop both commercial and research AI solutions.
Required Skills:
- Write effective Python code to tackle complex issues
- Use business sense and analytical abilities to glean valuable insights from public databases
- Clearly express the reasoning and logic when writing code in Jupyter notebooks or other suitable mediums
- Extensive experience working with Python
- Proficiency with the language's syntax and conventions
- Previous experience tackling algorithmic problems
- Nice to have some prior Software Quality Assurance and Test Planning experience
- Excellent spoken and written English communication skills
The ideal candidates should be able to:
- Clearly explain their strategies for problem-solving.
- Design practical solutions in code.
- Develop test cases to validate their solutions.
- Debug and refine their solutions for improvement.
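In that spirit, a minimal sketch in Python of the kind of algorithmic problem, solution, and validating test cases described above (the problem choice is illustrative):

```python
def two_sum(nums: list[int], target: int) -> tuple[int, int] | None:
    """Return indices of two numbers that sum to target, or None.

    A single pass with a value -> index map keeps this O(n) instead of
    the O(n^2) brute force over all pairs.
    """
    seen: dict[int, int] = {}
    for i, n in enumerate(nums):
        if target - n in seen:
            return seen[target - n], i
        seen[n] = i
    return None

# Test cases validating the solution, including an edge case.
assert two_sum([2, 7, 11, 15], 9) == (0, 1)
assert two_sum([3, 3], 6) == (0, 1)
assert two_sum([1, 2], 10) is None
```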
TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in predictive maintenance, OEE improvement, and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned universities, and the award of a renowned AI prize (e.g., EU Horizon 2020), which makes TVARIT one of the most innovative AI companies in Germany and Europe.
We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.
We are seeking a skilled and motivated Data Engineer from the manufacturing industry with over two years of experience to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure required for the collection, storage, processing, and analysis of large and complex data sets. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.
Skills Required
- Experience in the manufacturing industry (metal industry is a plus)
- 2+ years of experience as a Data Engineer
- Experience in data cleaning, structuring, and manipulation (see the sketch after this list)
- ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
- Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
- Experience in SQL and data structures
- Knowledge of big data technologies such as Apache Spark, Flink, and Hadoop, and of NoSQL databases.
- Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
- Proficient in data management and data governance
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
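As a small illustration of the cleaning and ETL items above, a sketch in Python with pandas; the file and column names are hypothetical:

```python
import pandas as pd

# Hypothetical sensor export from a manufacturing line.
raw = pd.read_csv("line_sensors.csv", parse_dates=["timestamp"])

cleaned = (
    raw
    .dropna(subset=["machine_id"])                         # drop unattributable rows
    .drop_duplicates(subset=["machine_id", "timestamp"])   # de-dupe re-sent readings
    .assign(temperature_c=lambda d: d["temperature_c"].clip(lower=0))  # clamp bad values
    .sort_values("timestamp")
)

# Load step: persist the structured result in a warehouse-friendly format.
cleaned.to_parquet("line_sensors_clean.parquet", index=False)
```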
Nice To Have
- Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
- Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud resources.
TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in predictive maintenance, OEE improvement, and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned universities, and the award of a renowned AI prize (e.g., EU Horizon 2020), which makes TVARIT one of the most innovative AI companies in Germany and Europe.
We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.
We are seeking a skilled and motivated Senior Data Engineer from the manufacturing industry with over four years of experience to join our team. The Senior Data Engineer will oversee the department’s data infrastructure, including developing a data model, integrating large amounts of data from different systems, building and enhancing a data lakehouse and the subsequent analytics environment, and writing scripts to facilitate data analysis. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.
Skills Required:
- Experience in the manufacturing industry (metal industry is a plus)
- 4+ years of experience as a Data Engineer
- Experience in data cleaning, structuring, and manipulation
- Architect and optimize complex data pipelines, leading the design and implementation of scalable data infrastructure, and ensuring data quality and reliability at scale
- ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
- Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
- Experience in SQL and data structures
- Knowledge of big data technologies such as Apache Spark, Flink, and Hadoop, and of NoSQL databases.
- Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
- Proficient in data management and data governance
- Strong analytical skills, with the ability to extract actionable insights from raw data to help improve the business.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
Nice To Have:
- Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
- Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud resources.
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field from top-tier Indian Institutes of Information Technology (IIITs).
Benefits and Perks
- A culture that fosters innovation, creativity, continuous learning, and resilience
- Progressive leave policy promoting work-life balance
- Mentorship opportunities with highly qualified internal resources and industry-driven programs
- Multicultural peer groups and supportive workplace policies
- Annual workcation program allowing you to work from various scenic locations
- Experience the unique environment of a dynamic start-up
Why should you join TVARIT?
Working at TVARIT, a deep-tech German IT startup, offers a unique blend of innovation, collaboration, and growth opportunities. We seek individuals eager to adapt and thrive in a rapidly evolving environment.
If this opportunity excites you and aligns with your career aspirations, we encourage you to apply today!
Who are we?
We are incubators of high-quality, dedicated software engineering teams for our clients. We work with product organizations to help them scale or modernize their legacy technology solutions. We work with startups to help them operationalize their idea efficiently. Incubyte strives to find people who are passionate about coding, learning, and growing along with us. We work with a limited number of clients at a time on dedicated, long term commitments with an aim of bringing a product mindset into services.
What we are looking for
We’re looking to hire software craftspeople. People who are proud of the way they work and the code they write. People who believe in and are evangelists of extreme programming principles. High quality, motivated and passionate people who make great teams. We heavily believe in being a DevOps organization, where developers own the entire release cycle and thus get to work not only on programming languages but also on infrastructure technologies in the cloud.
What you’ll be doing
First, you will be writing tests. You’ll be writing self-explanatory, clean code. Your code will produce the same, predictable results, over and over again. You’ll be making frequent, small releases. You’ll be working in pairs. You’ll be doing peer code reviews.
You will work in a product team. Building products and rapidly rolling out new features and fixes.
You will be responsible for all aspects of development – from understanding requirements, writing stories, analyzing the technical approach to writing test cases, development, deployment, and fixes. You will own the entire stack from the front end to the back end to the infrastructure and DevOps pipelines. And, most importantly, you’ll be making a pledge that you’ll never stop learning!
Skills you need in order to succeed in this role
Most Important: Integrity of character, diligence and the commitment to do your best
Must Have: SQL, Databricks (Scala / PySpark), Azure Data Factory, Test-Driven Development
Nice to Have: SSIS, Power BI, Kafka, Data Modeling, Data Warehousing
Self-Learner: You must be extremely hands-on and obsessive about delivering clean code
- Sense of Ownership: Do whatever it takes to meet development timelines
- Experience in creating end-to-end data pipelines
- Experience in Azure Data Factory (ADF) creating multiple pipelines and activities using Azure for full and incremental data loads into Azure Data Lake Store and Azure SQL DW
- Working experience in Databricks
- Strong in BI/DW/Datalake Architecture, design and ETL
- Strong in Requirement Analysis, Data Analysis, Data Modeling capabilities
- Experience in object-oriented programming, data structures, algorithms and software engineering
- Experience working in Agile and Extreme Programming methodologies in a continuous deployment environment.
- Interest in mastering technologies like relational DBMS, TDD, CI tools like Azure DevOps, complexity analysis, and performance tuning
- Working knowledge of server configuration / deployment
- Experience using source control and bug tracking systems, writing user stories and technical documentation
- Expertise in creating tables, procedures, functions, triggers, indexes, views, and joins, and in optimizing complex queries
- Experience with database versioning, backups, and restores
- Expertise in data security
- Ability to perform performance tuning of database queries
We are a technology company operating in the media space. We are the pioneers of robot journalism in India. We use a mix of AI-generated and human-edited content across media formats, be it audio, video or text.
Our key products include India’s first explanatory journalism portal (NewsBytes), a content platform for developers (DevBytes), and a SaaS platform for content creators (YANTRA).
Our B2C media products are consumed by more than 50 million users in a month, while our AI-driven B2B content engine helps companies create text-based content at scale.
The company was started by alumni of IIT, IIM Ahmedabad, and Cornell University. It has raised institutional financing from a well-renowned media-tech VC and a Germany-based media conglomerate.
We are hiring a talented DevOps Engineer with 3+ years of experience to join our team. If you're excited to be part of a winning team, we are a great place to grow your career.
Responsibilities
● Handle and optimise cloud (servers and CDN)
● Build monitoring tools for the infrastructure (see the sketch after this list)
● Perform a granular level of analysis and optimise usage
● Help migrate from a single cloud environment to multi-cloud strategy
● Monitor threats and explore building a protection layer
● Develop scripts to automate certain aspects of the deployment process
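As referenced above, a tiny monitoring sketch in Python that checks service health endpoints; the URLs are hypothetical, and a real setup would feed results into an alerting stack rather than print them:

```python
import urllib.request

# Hypothetical health-check endpoints for services behind the CDN.
ENDPOINTS = ["https://example.com/health", "https://api.example.com/health"]

def is_up(url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # covers URLError, HTTPError, and socket timeouts
        return False

for url in ENDPOINTS:
    print(f"{url}: {'OK' if is_up(url) else 'DOWN'}")
```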
Requirements and Skills
● 0-2 years of experience as a DevOps Engineer
● Proficient with AWS and GCP
● A certification from relevant cloud companies
● Knowledge of PHP will be an advantage
● Working knowledge of databases and SQL
This role will support the FinTech Product Team with various ongoing change initiatives and day-to-day product operations, such as:
a) Testing and data analysis of platform and financial accounting data
b) Campaign management for hosts and guests
c) CS ticket deep dives informing product functionality
Requirements:
- Experienced in core financial accounting concepts under U.S. GAAP accounting principles
- Ability to analyze and interpret large and complex datasets
- Strong attention to detail and ability to document test procedures and outcomes
- Demonstrated ability to work both independently and collaboratively
- Ability to write and execute SQL queries, Python Scripts
- Skilled in spreadsheet applications (e.g., Microsoft Excel, Google Sheets) for data analysis functionality - pivot tables, VLOOKUP, formulas, etc.
- Experience in Quote-to-Cash business processes is a plus
- Experience and familiarity with Oracle Financial Applications is a plus - particularly Oracle Financials General Ledger, Subledgers, Financial Accounting Hub.
- Familiarity with Big Data Systems (Presto etc.) is a plus
Our Work Culture
We constantly strive and take pride in building a productive, diverse and stress-free work environment for our employees. We take keen interest in ensuring maximum work-life balance for our employees. To achieve this, we offer benefits like –
additional performance-based perks, insurance benefits, generous leaves and vacations, informal office outings and many more.
To ensure holistic development of the employees, we also conduct workshops on personality development, stress handling, leadership, confidence building, etc.
If you think your values align with the vision of the company, kindly proceed with filling out the Job Application form and we will be happy to interact with you.
Good Luck!
Let’s develop together!
At Jules we’re proud to offer innovative solutions that meet the needs of the recycled materials industry and manufacturing supply chains. We aim to build, acquire or invest in industry start-ups.
As a developer at Jules, you will have the responsibility to work with the product manager and the rest of the scrum team to design, sprint and develop features for the end user that are production ready.
At Jules, developers are independent and own front-end, back-end, database, DevOps, scripts, etc. alike. The tech should be a tool to help you design and create the best product. You shouldn’t be afraid to tackle technical debt, contribute to open source projects, and be proud of your technical expertise.
What you will do in this role
- Work with the product manager and the UX designer to validate the features that need to be developed
- Estimating the work to be done and prioritizing the most relevant features first
- Challenging the work to be done as much as possible, in order to release an MVP as fast as possible
- Making sure that your features are production ready and monitoring them in the right environments
- Quality test your team’s features in order to make sure that user flows are bug free
- Helping the team design the technical strategy of the whole sprint
- Creating end to end tests to minimize the time to push to production
- Making sure that your features are performant in a production environment
- Helping everyone in the team to create high quality code
- Constantly improving
Who we are looking for
Skills and Qualifications:
- (must have) TypeScript: you need to know what interfaces are, how to type generic functions, and how to work with typing libraries such as lodash.
- (must have) Functional programming: higher-order functions, immutability, side effects, pure functions, etc. (see the sketch after this list)
- (must have) React OR React Native (hooks, side effects, optimization, callbacks, deep vs. shallow comparison, etc.)
- (must have) Any backend framework
- (must have) Any SQL language
- (nice to have) GraphQL, Jest, UI frameworks, OOP, SQL query optimization, Apollo Server, RabbitMQ, worker/slave architecture, microservice architecture, Docker, Terraform
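The role is TypeScript-first, but the functional-programming ideas carry across languages; here is a minimal sketch of pure functions, immutability, and higher-order functions, written in Python like the other samples on this page:

```python
from functools import reduce

def apply_discount(price: float, rate: float) -> float:
    """Pure function: the result depends only on the inputs and there are
    no side effects, so it is trivial to test and reason about."""
    return round(price * (1 - rate), 2)

prices = (100.0, 250.0, 40.0)  # a tuple: immutable, so no accidental mutation

# Higher-order functions: map and reduce take other functions as arguments.
discounted = tuple(map(lambda p: apply_discount(p, 0.10), prices))
total = reduce(lambda acc, p: acc + p, discounted, 0.0)

print(discounted)  # (90.0, 225.0, 36.0)
print(total)       # 351.0
```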
Requirements and Skills
- At least 3 years of experience developing a product, ideally B2B
- Experience in our technical stack
- Adept of functional programming
- Experience working closely with product management and dev teams to deliver solutions.
- Fluency in English and great communication skills
- Prior experience in B2B SaaS
Grow, develop and thrive with us
What we offer
Work closely with a global team helping bring market intelligence to the recycling world. As a part of our team, we look to foster relationships and help you grow with us.
You can also expect:
- As a global company, we treasure and encourage diversity, perspective, interest, and representation through inclusivity. The more we have, the better the solution.
- Connect and work with leading minds from the recycling industry and be part of a growing, energetic global team, across time zones, regions, offices and screens.
- Exposure to developments and tools within your field ensures evolution in your career and skill building.
- We adopt a Bring Your Own Device policy and encourage flexibility and freedom in how you work, through competitive compensation and yearly appraisals
- Health insurance coverage, paid vacation days and flexible work hours help you maintain a work-life balance
- Have the opportunity to network and collaborate in a diverse community.
Apply directly to us: https://nyteco.keka.com/careers/jobdetails/32684
About us
Blitz is into Instant Logistics in Southeast Asia. Blitz was founded in the year 2021. It is in the business of delivering orders using EV bikes. Blitz not only delivers instant orders through EV bikes, but also finances the EV bikes to the drivers on lease, generating another source of revenue from leasing apart from delivery charges. Blitz is revolutionizing instant logistics with the help of advanced technology-based solutions. It is a product-driven company and uses modern technologies to build products that solve problems in EV-based logistics. Blitz utilizes data coming from the EV bikes through IoT and smart engines to make technology-driven decisions and create a delightful experience for consumers.
About the Role
We are seeking an experienced Data Engineer to join our dynamic team. The Data Engineer will be responsible for designing, developing, and maintaining scalable data pipelines and infrastructure to support our data-driven initiatives. The ideal candidate will have a strong background in software engineering, database management, and data architecture, with a passion for building robust and efficient data systems
What you will do
- Design, build, and maintain scalable data pipelines and infrastructure to ingest, process, and analyze large volumes of structured and unstructured data.
- Collaborate with cross-functional teams to understand data requirements and develop solutions to meet business needs.
- Optimise data processing and storage solutions for performance, reliability, and cost-effectiveness.
- Implement data quality and validation processes to ensure accuracy and consistency of data.
- Monitor and troubleshoot data pipelines to identify and resolve issues promptly.
- Stay updated on emerging technologies and best practices in data engineering and recommend innovations to enhance our data infrastructure.
- Document data pipelines, workflows, and infrastructure to facilitate knowledge sharing and ensure maintainability.
- Create data dashboards from the datasets to visualize different data requirements.
What we need
- Bachelor's degree or higher in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer or similar role, with expertise in building and maintaining data pipelines and infrastructure.
- Proficiency in programming languages such as Python, Java, or Scala.
- Strong knowledge of database systems (e.g., SQL, NoSQL, BigQuery) and data warehousing concepts.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Familiarity with data processing frameworks and tools (e.g., Apache Spark, Hadoop, Kafka).
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Preferred Qualifications
- Advanced degree in Computer Science, Engineering, or related field.
- Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes).
- Knowledge of machine learning and data analytics concepts.
- Experience with DevOps practices and tools.
- Certifications in relevant technologies (e.g., AWS Certified Big Data Specialty, Google Professional Data Engineer).
Please refer to the Company’s website - https://rideblitz.com/
Key Responsibilities:
• Design, develop, support, and maintain automated business intelligence products using Tableau.
• Good understanding of data warehousing and data management concepts, including OLTP, OLAP, dimensional modelling, and star/snowflake schemas.
• Rapidly design, develop, and implement reporting applications that embed KPI metrics and actionable insights into the operational, tactical, and strategic activities of key business functions.
• Identify business requirements, design processes leveraging/adapting the business logic, and regularly communicate with business stakeholders to ensure delivery meets business needs.
• Design and code business intelligence projects using Tableau, ensuring best practices for data visualization and implementation.
• Develop and maintain dashboards and data sources that meet and exceed customer requirements.
• Utilize Python for data manipulation, automation, and integration tasks to support Tableau development (see the sketch after this list).
• Write and optimize SQL queries for data extraction, transformation, and loading processes.
• Partner with business information architects to understand the business use cases supporting and fulfilling business and data strategy.
• Collaborate with Product Owners and cross-functional teams in an agile environment.
• Provide expertise and best practices for data visualization and Tableau implementations.
• Work alongside solution architects in RFI/RFP response solution design, customer presentations, demonstrations, POCs, etc., for growth.
• Understanding of project life cycle and quality processes.
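As referenced above, a small sketch in Python of the kind of data preparation that feeds a Tableau dashboard; the file, columns, and KPIs are hypothetical:

```python
import pandas as pd

# Hypothetical orders extract; in practice this comes from the SQL sources
# behind the Tableau workbooks.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# KPI rollup: monthly revenue and order counts per region, ready to publish
# as a Tableau data source.
kpis = (
    orders
    .assign(month=orders["order_date"].dt.to_period("M").astype(str))
    .groupby(["region", "month"], as_index=False)
    .agg(revenue=("amount", "sum"), orders=("order_id", "count"))
)
kpis.to_csv("kpi_feed.csv", index=False)
```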
Qualifications:
• 5+ years of experience in Tableau development; Tableau certification is highly preferred.
• Proficiency in Python for data manipulation, automation, and integration tasks.
• Strong understanding and experience with SQL for database management and query optimization.
• Ability to independently learn new technologies and show initiative.
• Demonstrated ability to work independently with minimal direction.
• Desire to stay current with industry technologies and standards.
• Strong presentation skills – ability to simplify complex situations and ideas into compelling and effective written and oral presentations.
• Quick learner – ability to understand and rapidly comprehend new areas, both functional and technical, and apply detailed and critical thinking to customer solutions.
🚀 We're Hiring! 🚀
Looking for Backend + Flutter Developers proficient in Node, MongoDB, SQL, Google Cloud, Flutter, MVVM, and Provider. Interested?
Fill in this Google form for this opportunity 🔥
https://forms.gle/1hLJ5g55TtvyJwnG8
Salary: 5k to 20k
Remote work
Join the WhatsApp group (linked in the Google form) for the further hiring task.
Join Our Journey
Jules develops an amazing end-to-end solution for recycled materials traders, importers and exporters. Which means a looooot of internal, structured data to play with in order to provide reporting, alerting and insights to end-users. With about 200 tables covering all business processes, from order management to payments, including logistics, hedging and claims, the wealth the data entered in Jules can unlock is massive.
After working on a simple stack made of Postgres, SQL queries and a visualization solution, the company is now ready to set up its data stack and is only missing you. We are thinking DBT, Redshift or Snowflake, Fivetran, Metabase or Luzmo, etc. We also have an AI team already playing around with text-driven data interaction.
As a Data Engineer at Jules AI, your duties will involve both data engineering and product analytics, enhancing our data ecosystem. You will collaborate with cross-functional teams to design, develop, and sustain data pipelines, and conduct detailed analyses to generate actionable insights.
Roles And Responsibilities:
- Work with stakeholders to determine data needs, and design and build scalable data pipelines.
- Develop and sustain ELT processes to guarantee timely and precise data availability for analytical purposes.
- Construct and oversee large-scale data pipelines that collect data from various sources.
- Expand and refine our DBT setup for data transformation.
- Engage with our data platform team to address customer issues.
- Apply your advanced SQL and big data expertise to develop innovative data solutions.
- Enhance and debug existing data pipelines for improved performance and reliability.
- Generate and update dashboards and reports to share analytical results with stakeholders.
- Implement data quality controls and validation procedures to maintain data accuracy and integrity (see the sketch after this list).
- Work with various teams to incorporate analytics into product development efforts.
- Use technologies like Snowflake, DBT, and Fivetran effectively.
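As flagged above, a minimal sketch in Python of the kind of data-quality validation an ELT job might run before publishing a table; the rules, columns, and data are hypothetical:

```python
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Run minimal data-quality rules and return the ones that fail."""
    problems = []
    if df["order_id"].duplicated().any():
        problems.append("duplicate order_id values")
    if df["amount"].lt(0).any():
        problems.append("negative amounts")
    if df["created_at"].isna().any():
        problems.append("missing created_at timestamps")
    return problems

batch = pd.DataFrame({
    "order_id": [1, 2, 2],
    "amount": [10.0, -5.0, 7.5],
    "created_at": pd.to_datetime(["2024-01-01", None, "2024-01-02"]),
})
print(validate(batch))  # all three rules fire on this sample batch
```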
Mandatory Qualifications:
- Hold a Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Possess at least 4 years of experience in Data Engineering, ETL Building, database management, and Data Warehousing.
- Demonstrated expertise as an Analytics Engineer or in a similar role.
- Proficient in SQL, a scripting language (Python), and a data visualization tool.
- Mandatory experience in working with DBT.
- Experience in working with Airflow, and cloud platforms like AWS, GCP, or Snowflake.
- Deep knowledge of ETL/ELT patterns.
- At least 1 year of experience in building data pipelines and leading data warehouse projects.
- Experienced in mentoring data professionals across all levels, from junior to senior.
- Proven track record in establishing new data engineering processes and navigating through ambiguity.
- Preferred Skills: Knowledge of Snowflake and reverse ETL tools is advantageous.
Grow, Develop, and Thrive With Us
- Global Collaboration: Work with a dynamic team that’s making an impact across the globe, in the recycling industry and beyond. We have customers in India, Singapore, the United States, Mexico, Germany, France and more
- Professional Growth: a highway toward setting up a great data team and evolving into a leader
- Flexible Work Environment: Competitive compensation, performance-based rewards, health benefits, paid time off, and flexible working hours to support your well-being.
Apply to us directly: https://nyteco.keka.com/careers/jobdetails/41442
About the Role:
We are on the lookout for a dynamic Marketing Automation and Data Analytics Specialist, someone who is not only adept in marketing automation/operations but also possesses a keen expertise in data analytics and visualization. This role is tailor-made for individuals who are proficient with tools like Eloqua, Marketo, Salesforce Pardot, and Power BI.
As our Marketing Automation and Data Analytics Specialist, your responsibilities will span across managing and optimizing marketing automation systems and overseeing the migration and enhancement of data systems and dashboards. You will play a pivotal role in blending marketing strategies with data analytics, ensuring the creation of visually appealing and effective reports and dashboards. Collaborating closely with marketing teams, you will help in making data-driven decisions that propel the company forward.
We believe in fostering an environment where initiative and self-direction are valued. While you will receive the necessary guidance and support, the autonomy of your role is a testament to our trust in your abilities and professionalism.
Responsibilities:
- Manage and optimize marketing automation systems (Eloqua, Marketo, Salesforce Pardot) to map and improve business processes.
- Develop, audit, and enhance data systems, ensuring accuracy and efficiency in marketing efforts.
- Build and migrate interactive, visually appealing dashboards and reports.
- Develop and maintain reporting and analytics for marketing efforts, database health, lead scoring, and dashboard performance.
- Handle technical aspects of key marketing systems and integrate them with data visualization tools like Power BI.
- Review and improve existing SQL data sources for effective integration and analytics.
- Collaborate closely with sales, marketing, and analytics teams to define requirements, establish best practices, and ensure successful outcomes.
- Ensure all marketing data, dashboards, and reports are accurate and effectively meet business needs.
Ideal Candidate Qualities:
- Strong commitment to the role with a focus on long-term growth.
- Exceptional communication and collaboration skills across diverse teams.
- High degree of autonomy and ability to work effectively without micromanagement.
- Strong attention to detail and organization skills.
Qualifications:
- Hands-on experience with marketing automation systems and data analytics tools like Eloqua, Marketo, Salesforce Pardot, and Power BI.
- Proven experience in data visualization and dashboard creation using Power BI.
- Experience with SQL, including building and optimizing queries.
- Knowledge of ABM and Intent Signaling technologies is a plus.
- Outstanding analytical skills with an ability to work with complex datasets.
- Familiarity with data collection, cleaning, and transformation processes.
Benefits:
- Work-from-home flexibility.
- Career advancement opportunities and professional development support.
- Supportive and collaborative team environment.
Hiring Process:
The hiring process at InEvolution is thoughtfully designed to ensure alignment between your career goals and our company's objectives. The process will include:
- Initial Phone Screening: A brief conversation to discuss your background and understand your career aspirations.
- Team Introduction Interview: Candidates who excel in the first round will engage in a deeper discussion with our team, providing insights into our work culture and the specifics of the role.
- Technical Assessment: In the final round, you will meet our Technical Director for an in-depth conversation about your technical skills and how these align with the demands of the role.
About InEvolution
Founded in 2009, InEvolution stands as a beacon of excellence in providing back-office, operations, and customer support services globally. Our team, comprising highly skilled professionals, is committed to delivering top-notch quality services while ensuring cost efficiency. At InEvolution, we value innovation, quality, and our team's growth and development.
About the Role
- Work on building, processing and transferring data and dashboards from existing Domo Platform to Power BI Platform.
- Audit existing data systems and deployments and identify errors or areas for improvement.
- Utilize Power BI to build interactive and visually appealing dashboards and reports.
- Build Data Documentation and explanation on parameters, filters, models and relationships used in the dashboards.
- Review existing SQL data sources to improve, connect, and integrate them seamlessly with Power BI.
- Create, test and deploy Power BI scripts, as well as execute efficient migration practices.
- Work closely with the current analytics team to define requirements and migration steps, and maintain open, transparent communication while reviewing the migrated reports and data sources for successful outcomes.
- Ensure all dashboards and data sources are thoroughly reviewed by the team before publishing to the production environment.
- Convert business needs into technical specifications and establish a timeline for job completion.
Requirements & Skills:
- 2+ years of experience in using Power BI to run DAX queries and other advanced interactive functions.
- 2+ years of experience with Data Analysis and Data Visualization tools.
- 1+ years of experience working with Relational Databases and building SQL queries.
- Familiarity with Data Collection, Cleaning and Transformation processes.
- Attention to detail and the ability to work with complex datasets.
at Optisol Business Solutions Pvt Ltd
Role Summary
As a Data Engineer, you will be an integral part of our Data Engineering team, supporting an event-driven serverless data engineering pipeline on the AWS cloud. You will be responsible for assisting in the end-to-end analysis, development, and maintenance of data pipelines and systems (DataOps), and will work closely with fellow data engineers and production support to ensure the availability and reliability of data for analytics and business intelligence purposes.
Requirements:
· Around 4 years of working experience in data warehousing / BI systems.
· Strong hands-on experience with Snowflake AND strong programming skills in Python.
· Strong hands-on SQL skills.
· Knowledge of cloud databases such as Snowflake, Redshift, Google BigQuery, RDS, etc.
· Knowledge of dbt for cloud databases.
· AWS services such as SNS, SQS, ECS, Kinesis & Lambda functions, plus Docker (a minimal Lambda consumer sketch follows this list).
· Solid understanding of ETL processes and data warehousing concepts.
· Familiarity with version control systems (e.g., Git/Bitbucket, etc.) and collaborative development practices in an agile framework.
· Experience with scrum methodologies.
· Infrastructure build tools such as CFT / Terraform are a plus.
· Knowledge of Denodo, data cataloguing tools & data quality mechanisms is a plus.
· Strong team player with good communication skills.
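As a flavour of the event-driven serverless pattern this role supports, here is a minimal sketch of an SQS-triggered Lambda that lands messages in Snowflake. It assumes the snowflake-connector-python package, a hypothetical RAW_EVENTS table, and environment-variable credentials; it is a sketch, not the team's actual pipeline.

```python
# Hypothetical SQS -> Lambda -> Snowflake loader sketch.
import json
import os

import snowflake.connector


def lambda_handler(event, context):
    """Consume the SQS messages delivered to this Lambda and land them in Snowflake."""
    conn = snowflake.connector.connect(
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        account=os.environ["SF_ACCOUNT"],
        warehouse="LOAD_WH",   # assumed warehouse/database/schema names
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        for record in event["Records"]:  # one entry per SQS message
            payload = json.loads(record["body"])
            # INSERT ... SELECT is the Snowflake idiom for PARSE_JSON on load.
            cur.execute(
                "INSERT INTO RAW_EVENTS (event_id, payload) SELECT %s, PARSE_JSON(%s)",
                (payload["id"], json.dumps(payload)),
            )
        cur.close()
    finally:
        conn.close()
    return {"processed": len(event["Records"])}
```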
Overview Optisol Business Solutions
OptiSol was named on this year's Best Companies to Work For list by Great Place to Work. We are a team of 500+ Agile employees with a development center in India and global offices in the US, UK (United Kingdom), Australia, Ireland, Sweden, and Dubai. In a joyful journey of 16+ years, we have built about 500+ digital solutions. We have 200+ happy and satisfied clients across 24 countries.
Benefits, working with Optisol
· Great Learning & Development program
· Flextime, Work-at-Home & Hybrid Options
· A knowledgeable, high-achieving, experienced & fun team.
· Spot Awards & Recognition.
· The chance to be a part of the next success story.
· A competitive base salary.
More than just a job, we offer an opportunity to grow. Are you someone who looks ahead to build your future and build your dream? We have the job for you to make that dream come true.
Job Description:
We are looking for an experienced SQL Developer to become a valued member of our dynamic team. In the role of SQL Developer, you will be tasked with creating top-notch database solutions, fine-tuning SQL databases, and providing support for our applications and systems. Your proficiency in SQL database design, development, and optimization will be instrumental in delivering efficient and dependable solutions to fulfil our business requirements.
Responsibilities:
● Create high-quality database solutions that align with the organization's requirements and standards.
● Design, manage, and fine-tune SQL databases, queries, and procedures to achieve optimal performance and scalability.
● Collaborate on the development of DBT pipelines to facilitate data transformation and modelling within our data warehouse.
● Evaluate and interpret ongoing business report requirements, gaining a clear understanding of the data necessary for insightful reporting.
● Conduct research to gather the essential data for constructing relevant and valuable reporting materials for stakeholders.
● Analyse existing SQL queries to identify areas for performance enhancements, implementing optimizations for greater efficiency.
● Propose new queries to extract meaningful insights from the data and enhance reporting capabilities.
● Develop procedures and scripts to ensure smooth data migration between systems, safeguarding data integrity.
● Deliver timely management reports on a scheduled basis to support decision-making processes.
● Investigate exceptions related to asset movements to maintain accurate and dependable data records.
Requirements and Qualifications:
● A minimum of 3 years of hands-on experience in SQL development and administration, showcasing a strong proficiency in database management.
● A solid grasp of SQL database design, development, and optimization techniques.
● A Bachelor's degree in Computer Science, Information Technology, or a related field.
● An excellent understanding of DBT (Data Build Tool) and its practical application in data transformation and modelling.
● Proficiency in either Python or JavaScript, as these are commonly utilized for data-related tasks.
● Familiarity with NoSQL databases and their practical application in specific scenarios.
● Demonstrated commitment and pride in your work, with a focus on contributing to the company's overall success.
● Strong problem-solving skills and the ability to collaborate effectively within a team environment.
● Excellent interpersonal and communication skills that facilitate productive collaboration with colleagues and stakeholders.
● Familiarity with Agile development methodologies and tools that promote efficient project management and teamwork.
MySQL Database Developer
Strong development knowledge in DB design & development with 6 to 10 years of experience (MySQL DB) – Mandatory
Strong hands-on experience writing complex PL/pgSQL, procedures, and functions, and preventing blocking and deadlocks (see the sketch after this list)
Conduct SQL object code reviews & performance tuning (Mandatory)
Hands-on experience with Microsoft SQL Server and Postgres DB is an advantage.
Strong knowledge of RDBMS and NoSQL concepts with strong logical thinking and solutions
Expert in transactional databases (OLTP) and ACID properties, with experience handling large-scale application databases
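On preventing blocking and deadlocks: deadlocks typically arise when two transactions lock the same rows in opposite orders. Below is a minimal sketch of the standard remedy, consistent lock ordering, assuming psycopg2 and a hypothetical accounts table; it illustrates the technique rather than any production code.

```python
# Deadlock-avoidance sketch: always acquire row locks in primary-key order.
import psycopg2


def transfer(conn, from_id, to_id, amount):
    # Lock both rows in id order regardless of transfer direction, so two
    # concurrent transfers can never wait on each other in a cycle.
    first, second = sorted((from_id, to_id))
    with conn, conn.cursor() as cur:  # the connection context commits/rolls back
        cur.execute(
            "SELECT id FROM accounts WHERE id IN (%s, %s) ORDER BY id FOR UPDATE",
            (first, second),
        )
        cur.execute(
            "UPDATE accounts SET balance = balance - %s WHERE id = %s",
            (amount, from_id),
        )
        cur.execute(
            "UPDATE accounts SET balance = balance + %s WHERE id = %s",
            (amount, to_id),
        )
```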
For a product-based company in Healthcare
Hi,
Edu: BE/B.tech/MCA
Notice Period : Immediate - 15 days
Permanent remote
Job Description :
Mandatory Skills : SQL Server & Database Architecture.
Description
• 10 years of database experience.
• 3-4 years previous experience as a database architect or in a data management role.
• Provide development and administration support for various databases, primarily SQL Server databases, in a client-server development environment, including stored procedures and SQL, index performance tuning, and database design and file storage.
• Solid knowledge of relational database architecture and concepts.
• Excellent multitasking skills and task management strategies.
• Confident in decision making and the ability to explain processes.
• Develop, deploy, and maintain Extract-Transform-Load (ETL) scripts and schedules.
• Develop and implement all database indexing, maintenance, back-up, transformations, and stored procedures.
• Assist Engineering Team in implementing appropriate data access permission schemas.
• Participate in the refinement and development of user stories, assessing the impact of new development on the current database implementation.
• Communicate with technical and product owner stakeholders to help them anticipate and mitigate configuration management issues.
• Knowledge of developing analytic reports and dashboards using Microsoft Power BI or other similar tools.
• Excellent command of the SQL language with the ability to write and optimize complex queries.
• Achieves organizational goals by defining, integrating, and upgrading a comprehensive architecture to support applications, platforms, and databases.
• Maintains database by determining structural requirements and developing and installing solutions.
• Ensures security of all information, computer systems, and digital data.
• Compare and analyze provided statistical information to identify patterns, relationships, and problems.
• Ability to perform root cause analysis and produce the mitigation plan for the issues and risks identified.
• Strong technical documentation skills, with requisite attention to detail.
We are looking for a Full Stack Developer (remote position).
Strong knowledge of HTML, CSS, JavaScript, and JavaScript frameworks (e.g., React, Angular, Vue)
• Experience with backend technologies such as Node.js, DynamoDB, and PostgreSQL.
• Experience with version control systems (e.g., Git)
• Knowledge of web standards and accessibility guidelines
• Familiarity with server-side rendering and SEO best practices
Experience with AWS services, including AWS Lambda, API Gateway, and DynamoDB.
• Experience with Dev Ops- Infrastructure as Code, CI / CD, Test & Deployment Automation.
• Experience writing and maintaining a test suite throughout a project's lifecycle.
• Familiarity with Web Accessibility standards and technology
• Experience architecting and building GraphQL APIs and RESTful services.
Hi,
We need a full-stack developer who can write quality code quickly, as we are a fast-paced startup.
Roles and Responsibilities
- Backend development in Python/Flask
- Frontend development in React/Next
- Deployment using AWS
You will learn a lot on the job, so we need someone who is willing to learn and put in the work.
at Simform
Company Description:
Simform is an innovative product engineering company that assists organizations of any size to identify and solve key business challenges with DevOps, cloud-native development, and quality engineering services. Founded in 2010, our agile remote teams of engineers immerse themselves in your project, maintain your company culture, and work in line with your strategic goals. At Simform, we are dedicated to developing competitiveness and agility for companies using software.
Role Description:
This is a full-time hybrid role for a Sr. Python/Cypress Automation Engineer located in Ahmedabad, India, with flexibility for some remote work. The Sr. Python/Cypress Automation Engineer will be responsible for developing, testing, and maintaining a scalable automation framework for web and mobile applications. The Sr. Python/Cypress Automation Engineer will also work closely with cross-functional teams to identify and resolve issues, and collaborate with other QA engineers to ensure high-quality solutions.
Qualifications:
- Bachelor's degree in Computer Science or a related field with 4+ years of relevant work experience
- Strong proficiency in Python with an excellent understanding of Python unit testing frameworks (PyTest, Unittest), web scraping (bs4, lxml), and good-to-have modules (requests, pandas, numpy, etc.); a short scraping-test sketch follows this list
- Good knowledge of testing frameworks such as Cypress, Selenium, Appium, and TestNG
- Experience in End-to-End testing, Integration testing, Regression testing, and API testing
- Demonstrated experience with test reporting tools such as Allure, ExtentReports, and Cucumber
- Excellent understanding of CI/CD pipeline, build automation, and deployment pipelines
- Experience in software development methodologies such as Agile, Scrum and Kanban
- Expertise in SQL and other databases like DynamoDB, MongoDB, etc.
- An analytical mind with keen problem-solving skills and attention to detail
- Excellent verbal and written communication skills in English
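As a small illustration of the PyTest-plus-scraping combination listed above, here is a sketch of a parser with a network-free unit test. The fetch helper, URL, and expected title are placeholders, not Simform code.

```python
# Scrape a page title with requests + BeautifulSoup, and unit-test the parser.
import requests
from bs4 import BeautifulSoup


def page_title(html: str) -> str:
    """Parse an HTML document and return its <title> text."""
    soup = BeautifulSoup(html, "lxml")
    return soup.title.get_text(strip=True) if soup.title else ""


def fetch_title(url: str) -> str:
    # Thin network wrapper kept separate so the parser stays testable offline.
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return page_title(response.text)


# pytest discovers test_* functions automatically; the parser is exercised
# against a fixed document, so the test needs no network access.
def test_page_title():
    assert page_title("<html><head><title>Jobs</title></head></html>") == "Jobs"
```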
Why Simform?
- Flat-hierarchical, friendly, engineering-oriented, and growth-focused culture.
- Flexible work timing, 18+12 leaves, and leaves for life events.
- Free health insurance
- Office facility with large fully-equipped game zone, in-office kitchen with affordable lunch service, and free snacks.
- Sponsorship for certifications/events, library service, and the latest cutting-edge tech learning.
DevDarshan is a devotional platform launched by IIT graduates to promote the teachings of Indian culture and the Hindu way of life in India and around the world. In the 21st century, where everything around us is digitized, why not temples? That’s the idea behind DevDarshan. We’ve built a community of devotees from multiple countries through our mobile application that connects temples and devotees, have successfully raised seed investment, and have started to generate revenue for the temples and priests associated with us. Right now we are looking to grow our team and build new, exciting features for devotees all around the world.
This is where you come in.
We are looking for a passionate and self-motivated individual to help design our backend Systems to support both the Mobile App and WebApp
Requirements:
- Experience in NodeJS, TypeScript, ExpressJS, and AWS EC2; you have built backend REST APIs
- Expert in system design and software architecture processes, and how different components interact with each other at scale
- Experience with DevOps, Docker, AWS, Google Cloud.
- Experience in Managing Development Teams, complete delivery lifecycle
- Good understanding and experience of NoSQL and SQL databases, and which to use when.
- Experience with CI/CD Systems like Jenkins, Github Actions.
- Some Experience with Realtime Databases/Systems or Socket based applications would be preferred.
- Some Experience with building Algorithms, Social Apps is preferred.
- Any experience with Handling Video Delivery like ffmpeg/HLS/WebRTC is preferred but not mandatory.
The Role
This Role naturally progresses into Engineering Manager / Software Architect.
- You will be involved at all stages of the product development process, from design to development and deployment.
- You will architect, build, and scale backend systems that power our applications, which will be used by millions of devotees every day.
- You have a passion for improving techniques, processes, and tracking, and will work daily toward continuously improving our engineering practices.
at Gipfel & Schnell Consultings Pvt Ltd
Here are some technologies you may find useful for web development:
- Vue.js for the front-end framework
- Laravel for the back-end framework
- PHP 7+ for server-side scripting
- SQL for database management
- Docker for containerization purposes.
Role - Senior Analytics Executive
Experience - 1-2 years
Location - Open (Remote working option available)
About Company :-
Our team is made up of best-in-class digital, offline, and integrated media experts who work together to enhance media's contribution to Google's business. Our team operates in a seamlessly integrated way across strategy, planning, investment, creative, business sciences and analytics, and data and technology. The people who work here are world-class, committed to establishing a new high-water mark in the media industry.
About the role/ Some of the things we'd like you to do:
- Support the Analytics team and other stakeholders with analytics agendas impacting campaigns, measurement frameworks, and campaign optimization.
- Conduct thorough data analysis using various tools and software within MFG to provide insights and recommendations that align with client needs.
- Collaborate with internal stakeholders across disciplines and countries to understand client objectives, providing support and expertise in data and analytics while identifying opportunities for advanced analytics solutions.
- Formulate strategic recommendations with the team based on gathered insights, and support the delivery of reporting for clients.
- Continuously improve project performance and processes by collaborating with the team to develop new tools and solutions.
About yourself/ Requirements:
- Bachelor's degree in a related quantitative field (e.g. Statistics, Business Analytics, Economics, Computer Science, etc.)
- 1-2 years of relevant work experience in data analysis; digital media experience desired
- Strong knowledge of various data analysis tools and software (e.g., Excel, SQL, R, Python, Tableau).
- Proficient in statistical principles and how they apply to tasks/work items.
- Excellent problem-solving skills and the ability to analyze complex data sets.
- Strong communication and interpersonal skills, with the ability to present data-driven insights to both technical and non-technical audiences.
- Ability to work independently and as part of a team, with strong collaboration skills.
- Demonstrated ability to manage multiple projects and prioritize tasks effectively.
- Passion for continuous learning and staying current with industry trends and best practices in analytics.
Lantern Pharma is looking for talented and highly motivated R Shiny based front-end web application developers to visualize the data analyzed and the reports generated by our bioinformatics pipelines. Leveraging your background and applying software engineering principles, you will develop applications and tools in support of data visualization and reporting using R / R Shiny. You will be involved in generating complex graphs used for statistical analysis and interpretation, using interactive libraries such as Plotly. In addition to development of new code, you will support and maintain standard R programs, which will comprise a library of code to be used for future development. This is a 6-month contract position (40 hrs / week) with the possibility of full-time employment based on performance.
RESPONSIBILITIES:
- Design and develop interactive web applications using R / R Shiny to visualize ML outputs, multi-omics data and reports generated by bioinformatics pipelines
- Use the open-source or proprietary R libraries to build useful applications and packages
- Write efficient, clean and well tested code that adheres to best practices
- Organize data, analysis, reports, and applications into reproducible pipelines
- Test and debug applications to ensure they are working correctly, using both manual and automated testing techniques
- Stay up-to-date with the latest R packages and developments in the pharma industry
- Publish R packages to internal company repository
- Deploy and maintain the R Shiny web-application on the AWS server
BASIC QUALIFICATIONS:
- Bachelor's degree required (Master’s degree preferred) in a scientific discipline such as Statistics, Computer Science, Bioinformatics etc.
- Minimum 2 years of R / R Shiny, R Markdown experience with standard packages for data visualization and analysis
- Proven ability to build interactive R Shiny applications (be prepared to demo an R Shiny application)
- Proficient in libraries such as ggplot2, plotly, dplyr, Highcharter and DT
- Must have experience in generating and working with R markdown reports
- Experience with AWS for hosting R Shiny applications is a plus
- Knowledge of cancer biology, genomics and drug development is a plus
- Familiarity with JavaScript, HTML, CSS, Python a plus
- Experience in building a database using Relational / Non relational Database (SQL / NoSQL)
- Must be capable of developing collaborative relationships with colleagues
- Must be self-motivated and have strong analytical, interpersonal, and project management skills
- Must demonstrate the ability to present ideas, issues, and observations in a clear manner
Lantern provides multiple growth opportunities and as an early team member, your work will have a direct impact on precision oncology that can transform drug development. In addition to attractive compensation, we offer employees the opportunity for competitive health, dental & vision insurance, stock options in a public company, an opportunity to take leadership on new and meaningful projects, & involvement with leading conferences & industry trade shows.
Greetings from 63 moons technologies!!!
We are currently looking for Technical Support Engineer (capital market domain) for our organization. It is a Work From Home profile.
63 moons technologies ltd. (63MT) is a global leader in creating and operating technology-centric, next-generation financial markets that are transparent, efficient and liquid, across multi asset classes, including equities, commodities, currencies and bonds, among others. Its highly robust and scalable exchange and trading technology, coupled with deep domain expertise, gives it a decisive edge in driving mass disruptive innovation that is unmatched in financial markets. This uniquely positions 63MT as the creator of electronic, organized and regulated financial markets for new asset and investor classes that are either under-served or economically unviable to be served by traditional markets.
Job Location: Andheri East, Mumbai
Job Description:
- Candidate should have experience in Technical Support
- Strong experience and knowledge of MS SQL
- Experience in networking and issue handling
- Ability to provide network support
- Ability to work on web and Windows based applications to provide technical support
- Ability to communicate well with clients and team
- Good communication
- Good to have: experience in AWS and Linux
- Should be able to work in rotational shifts (General, Afternoon, and Night)
Name:
Company:
Designation:
Current CTC:
Expected CTC:
Notice period:
Overview of the role
As a Node Developer, you will collaborate with cross-functional teams to define, design, and build performant modern web applications and services, writing clean and modular code.
Must have skills
- Overall 3-5 Years of experience in writing unit and integration tests to ensure the robustness and reliability of web applications and services.
- Measure and improve the performance of microservices. Catalyse growth within the team through code reviews and pair programming to maintain high development standards
- Investigate operational issues to find the root cause and propose improvements. Design, build, and maintain APIs, services, and systems across our engineering teams.
- Expert level of experience in the design and development of Web Applications, and highly scalable distributed systems.
- Should have experience in development skills using the latest technologies including NodeJS (Fastify framework), Microservices, PostgreSQL, Redis, etc. Should have exposure to NoSQL and SQL development.
- Comprehensive knowledge of physical and logical data modeling, and performance tuning.
- Should possess excellent communication, presentation, and interpersonal skills.
- Ability to work collaboratively and productively with globally dispersed teams.
- Build high-performance teams and Coach teams for successful career growth.
- Experience working with relational and non-relational databases, query optimization, and designing schema
Desired Background
Bachelor's/Master's degree in Computer Science or Information Technology
Odoo Technical Consultant
Job Description:
Strong knowledge of Python and programming concepts.
Complete understanding of Odoo basic flow.
Knowledge of data models available in Odoo core.
Proven expertise in developing custom modules in Odoo (a minimal module sketch follows this list).
Odoo techno-functional knowledge.
Experience developing on the latest versions of Odoo and excellent debugging skills in Odoo.
Experience migrating from earlier Odoo versions to newer versions.
Core knowledge of the current features available in Odoo – Sales, Purchase, CRM, Accounts, Inventory, Projects, Timesheet, HR, etc.
Good knowledge of PostgreSQL with the ability to write SQL queries.
View customization – work on Widgets, Wizards, JavaScript, view XML, etc.
QWeb report creation.
Knowledge of Version Control System like GitHub.
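To illustrate what custom module development in Odoo looks like, here is a minimal model sketch following standard Odoo conventions. The model name, fields, and onchange logic are illustrative assumptions only (the technical name echoes Odoo's classic tutorial module).

```python
# Hypothetical custom Odoo model sketch.
from odoo import api, fields, models


class CourseSession(models.Model):
    _name = "openacademy.session"      # assumed technical model name
    _description = "Course Session"

    name = fields.Char(required=True)
    seats = fields.Integer(string="Number of Seats")
    instructor_id = fields.Many2one("res.partner", string="Instructor")
    state = fields.Selection(
        [("draft", "Draft"), ("confirmed", "Confirmed")],
        default="draft",
    )

    @api.onchange("instructor_id")
    def _onchange_instructor_id(self):
        # Pre-fill the session name from the selected instructor.
        if self.instructor_id and not self.name:
            self.name = "Session - %s" % self.instructor_id.name
```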
Required Skill Set:
• Project experience in any of the following – Data Management, Database Development, Data Migration, or Data Warehousing.
• Expertise in SQL, PL/SQL.
Role and Responsibilities:
• Work on a complex data management program for a multi-billion dollar customer.
• Work on customer projects related to data migration and data integration.
• No troubleshooting.
• Execution of data pipelines, performing QA, and project documentation for project deliverables.
• Perform data profiling, data cleansing, and data analysis for migration data (see the sketch after this list).
• Participate and contribute in project meetings.
• Experience in data manipulation using Python preferred.
• Proficient in using Excel, PowerPoint.
• Perform other tasks as per project requirements.
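As an example of the data profiling and cleansing step mentioned in the responsibilities, here is a small pandas sketch. The file names and column names are assumptions for illustration.

```python
# Profile and cleanse a legacy extract ahead of migration (illustrative only).
import pandas as pd

df = pd.read_csv("legacy_customers.csv")

# Profile: row counts, nulls per column, and duplicate keys.
print(df.shape)
print(df.isna().sum())
print("duplicate ids:", df["customer_id"].duplicated().sum())

# Cleanse: normalise casing/whitespace, drop exact duplicates, and set aside
# rows that would violate a NOT NULL key constraint after migration.
df["email"] = df["email"].str.strip().str.lower()
df = df.drop_duplicates()
rejected = df[df["customer_id"].isna()]
df = df.dropna(subset=["customer_id"])

df.to_csv("customers_cleaned.csv", index=False)
rejected.to_csv("customers_rejected.csv", index=False)
```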
Location: Hires Remotely Everywhere
Job Type: Full Time
Experience: 3+ years
Salary: INR 7-10 LPA
Key Responsibilities:
- Strong automation QA individual who can finalize, design, and build an automation framework from scratch (see the sketch after this list)
- Advanced programming skills including automation systems and databases
- Strong knowledge of software QA methodologies, PM tools, and QA tools
- Familiar with scripting and markup languages including HTML, CSS, and Python
- Hands-on experience with both white box and black box testing
- Expertise in automation tools like Selenium, Pytest, Behave and Cucumber
- Familiar with CI/CD tools like- Jenkins, TeamCity
- Familiar with version control systems like Bitbucket, GitHub
- Experience working in an agile/scrum development process
- Experience in writing clear, concise and comprehensive test plans and test cases
- Detailed knowledge of application functions, bug fixing and testing protocols
- Good written and verbal communication skills
- Excellent analytical skills
- Strong attention to detail
Nice to have:
- Very good knowledge of Python programming
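As a seed for the kind of automation framework described above, here is a minimal pytest + Selenium smoke test. The target URL and assertions are placeholders, not the actual suite.

```python
# Minimal pytest + Selenium smoke-test sketch.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By


@pytest.fixture
def driver():
    # Selenium Manager (Selenium 4.6+) resolves the browser driver itself.
    drv = webdriver.Chrome()
    yield drv
    drv.quit()


def test_homepage_loads(driver):
    driver.get("https://example.com")
    # Black-box checks: the page title and a visible heading.
    assert "Example" in driver.title
    assert driver.find_element(By.TAG_NAME, "h1").is_displayed()
```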
Ventura Securities is establishing its fintech team - a remote-first, work from anywhere team with highly talented individuals who come from diverse backgrounds and looking to solve real client problems at scale. Ventura has been in the stockbroking business for 20+ years and it is a robust and profitable franchise with 1000+ employees currently. We are looking for passionate techies with skills primarily around AWS and Python who are aspiring for a fast-track career.
Your Key Responsibilities:
1. Build out of the Ventura cloud-based backend platform from scratch
2. Ownership and monitoring of our D2C backend
3. Robust documentation skills and desire to share information with others
4. Desire and ability to prototype ideas quickly, and be willing to experiment and learn
Basic Requirements:
· Very strong Python, Lambda, SQL, and general AWS skills
· Clean coding skills around Python or Go and SQL
· Demonstrable experience writing testable code, working with git, doing peer-level code review, daily standups, and generally championing software excellence
What you get:
1. Chance to build out a next-gen fintech product from ground 0
2. Opportunity to influence the design of the product
3. Flexible and work anywhere environment running out of Slack
4. Flat org structure
Job Description: Core Java Developer
Company: Mobile Programming LLC
Location: Pune (Remote work available)
Salary: Up to 16 LPA
Position: Core Java Developer
Responsibilities:
- Design, develop, and maintain Java-based applications using Core Java and Spring Boot frameworks.
- Collaborate with cross-functional teams to deliver high-quality software solutions.
- Conduct code reviews and troubleshoot/debug complex issues.
- Optimize application performance and stay updated with industry trends.
Requirements:
- Minimum 6 years of hands-on Core Java development experience.
- Strong proficiency in Spring Boot framework.
- Solid understanding of OOP principles, web development (HTML/CSS/JavaScript), RESTful web services, and SQL.
- Experience with Git and problem-solving skills.
- Excellent communication skills and ability to work independently and as part of a team.
- Bachelor's or Master's degree in Computer Science or a related field.
Note: Immediate joiners required. Please include your resume and relevant code samples/projects when applying.
It's regarding a permanent opening with Data Semantics
Data Semantics
We are a product-based company and a Microsoft Gold Partner.
Data Semantics is an award-winning Data Science company with a vision to empower every organization to harness the full potential of its data assets. In order to achieve this, we provide Artificial Intelligence, Big Data, and Data Warehousing solutions to enterprises across the globe. Data Semantics was listed as one of the top 20 Analytics companies by Silicon India 2018 and as one of the Top 20 BI companies by CIO Review India 2014. We are headquartered in Bangalore, India, with offices in 6 global locations including the USA, United Kingdom, Canada, United Arab Emirates (Dubai and Abu Dhabi), and Mumbai. Our mission is to enable our people to learn the art of data management and visualization to help our customers make quick and smart decisions.
Our Services include:
Business Intelligence & Visualization
App and Data Modernization
Low Code Application Development
Artificial Intelligence
Internet of Things
Data Warehouse Modernization
Robotic Process Automation
Advanced Analytics
Our Products:
Sirius – World’s most agile conversational AI platform
Serina
Conversational Analytics
Contactless Attendance Management System
Company URL: https://datasemantics.co
JD:
MSBI
SSAS
SSRS
SSIS
Datawarehousing
SQL
Data Semantics
We are a product-based company and a Microsoft Partner; the full Data Semantics company profile appears in the listing above.
JD:
.Net
.Net Core
Logic App
SQL
Regards,
Deepu Vijayan | Talent Acquisition
Job description
Copperchips is hiring full-stack Spring Boot developers with hands-on experience in the technologies below:
- Spring Boot
- AngularJs
- Spring Data jpa
- Hibernate
- SQL
- Postgres
- Rest API
Additionally, knowledge of the technologies below will be a plus:
- Experience in Angular.js is a big plus
- Cloud environments, preferably AWS; GCP / Azure would also work
- Knowledge of Git/Bitbucket and other version management tools is a must
- Queueing frameworks, Webhooks and Event Driven Development
- CI/CD implementation using Jenkins / bamboo
You will be responsible for:
- Designing/architecting application features/modules
- Develop, build, test, and deploy software product features by adopting industry best practices
- Preparing technical documentation, translating BRDs into FRDs, and decomposing requirements into a logical work breakdown structure
- Pro-actively propose solutions and strategies to business challenges
Skills for Success
- A quick learner with good written and oral communication skills
- Should be able to take new initiatives and ownership of tasks
- Must have the ability to work independently with little to no supervision
- Must be flexible to work in a multi-shift environment
- Candidate should be willing to travel onsite for short/long-term assignments
A global technology and cloud services company.
Company Overview
Established in 2014, they are a global technology services company that delivers Digital Transformation, Cloud, Data and Insights, Cybersecurity, and Strategic Staffing solutions that improve customers’ businesses and bottom line. They are a team of over 400 people, headquartered in the US, with India as one of their operating countries.
Location: Remote
Skills: .NET Core, C#, MVC, Angular 6 or above, SQL
Experience – 5 to 10 years
Max CTC for Sr. .NET full-stack developer: 20-24 LPA, based on experience level.
About Us
At Toddle, we are on a mission to help schools around the world deliver better learning. We are a team of 2x founders – before founding Toddle, we set up a network of preschools and schools in India. We continue to operate these preschools and schools as it gives us a unique vantage point into the challenges faced by schools.
Our belief is that real impact in education can only be achieved by improving schools. That’s because children spend more than 80% of their learning time in school and schools are here to stay as schools are not just about teaching & learning, they also play a pivotal social role in our lives. So while most Edtech companies are trying to solve for education by working outside the school system, we think that the focus needs to be on taking our schools to the next level.
Our first product is a collaboration platform for teaching teams – we say that Toddle is to teaching, what Figma is to design. In its current version, Toddle empowers teachers to work together and better across all stages of the teaching-learning process – right from curriculum planning and assessments to progress-tracking and family communication.
Here are some key facts about us:
1. Currently used by 1,000+ schools and 30,000+ teachers from 107 countries
2. Global revenue – 40% from Americas, 30% from EMEA, 30% from APAC
3. NRR > 115%
4. NPS > 70
5. Annual churn < 5%
6. Inbound led GTM motion fuelled by positive word of mouth
We love solving for
1. Creating delightful experiences that help teaching teams collaborate better & be more effective
2. Building tools that help teachers personalise learning for their students
3. Helping teachers around the world connect with each other meaningfully
We are at a series A stage right now, growing fast in the SAAS Ed-tech space, and have been funded by Sequoia, Matrix, Beenext, and Better Capital among others.
Here’s what you’ll be doing
We are looking for a passionate and skilled Backend engineer, one who partners with us in creating beautiful products for teachers, students and parents alike. As a Backend engineer, you will play a major role in designing, developing and deploying high-quality web platforms. In this multifaceted role, you will get the opportunity to work along curriculum experts, teachers, and students, and user-test the product in real school settings.
Required Skills:
Frameworks & Technologies: Node.js, Express, GraphQL, JWT
Database: SQL query writing and optimisation.
Tools: Git basics, Scripting basics
Good to have:
DevOps experience with AWS, CI/CD
Working experience in PostgreSQL
Is this someone that looks like you?
• Education: B.E/B.Tech degree
• Experience: 1 to 3 years of relevant working experience (SaaS preferred)
• Soft Skills: Having a bias towards action, a good sense of design, empathy, and good communication skills
• We deeply value building the right culture at Toddle, and these are a few things that we look for in each hire - Coach-ability, Curiosity, Ownership, Hustle and Humility
Excited about the role?
Here are a few more benefits:
1. Work from anywhere - Work from where you like - home, co-working space, cafe or the hills
2. Exposure to diverse learning opportunities
Work across different projects & teams to develop skills outside of your core expertise
Access to a small budget towards learning (e.g. books, online courses, substacks)
3. Uncapped leave policy
We trust you fully on your commitment to our mission and your judgement on planning your time and taking leaves:
No cap on the number of sick or casual leaves
Special paid leaves for childbirth, wedding etc.
And if in any circumstance you feel burnt out - there is a little something for this too!
No questions asked menstrual leave
4. Flexible working hours - Block “no-meeting hours” to enable uninterrupted focused work
5. No bell curve performance evaluations
6. And yes, a super fun and smart group of folks to collaborate and grow with.
Still with us? Here’s the process:
1. Shortlisting:
• Apply for the role.
• We’ll check out your application and assess it for a fit
• You will hear back from us within 2-3 days of the submission if shortlisted
2. Once shortlisted, we usually do about 2-3 rounds of interviews, a written assignment, and deep reference checks.
3. We value everyone’s time hence we make it a point to communicate proactively at all stages of your application. You can always reach out to us for any questions.
at Cybrilla Technologies
Who we are?
Cybrilla is a financial infrastructure company. We are building the frontend applications & APIs from the ground up to support a variety of use cases and enable a superior digital experience for different user personas. Cybrilla's current focus is to build the underlying operating system for mutual funds.
AMCs / Fintech startups / Wealth management businesses can choose the product that works best for them to enable an awesome experience for their customers and internal stakeholders.
About the team
FPApp is a hosted UI platform to build, operate and invest in Indian wealth management instruments. Our vision is to create products for all user personas in the ecosystem, be it an investor, distributor, business team, operation, or support.
We are looking for minds that have a knack for understanding user problems, simplifying investment journeys and building meaningful products!
Responsibilities
- Own the implementation of frontend functional workflows and applications for the Fintech ecosystem.
- Analyse problem statements and design technical solutions which may involve building client-side API integrations or server side business logic.
- Ensure enterprise level scalability, reliability and security standards for front end workflows and applications.
- Continuously evolve ways of designing for better user experience and lead the implementation of same.
Requirements
Technical Skills
Frontend
- Hands-on experience on Typescript is a must
- Deep Expertise in Javascript - including manipulating DOM, acquaintance with newer specifications of ECMAScript
- Understanding of Javascript Tooling - Build Tools, Lint Tools, Tree Shaking. Must have basic understanding of Babel, Webpack etc.,
- Hands on Experience in breaking complex UI Components into small reusable components
- Hands on Experience in using state management using Redux and understanding of React Context
- Understanding of Patterns in Javascript
Backend
- Strong understanding of SQL.
- Hands-on experience in Node.js Applications.
- Experience with Ruby on Rails, Java would be a value add.
- Extensive background in designing, developing, testing, deploying, maintaining, and improving software.
General
- 8+ years of experience in building web and responsive apps in a customer-facing or B2B SaaS environment
- Interest in engaging with latest technologies and evaluating strategies to keep technology stack up to date
- Experience mentoring junior engineers in all aspects of planning, development and testing
- Demonstrable experience architecting scalable and cost-effective technical solutions to solve business goals
- Understanding of Git – merging, rebasing, etc.
- Experience with CI/CD pipelines - one of Github Workflows, Gitlab CI or anything else
- Experience with hosting in any of the cloud hosting providers
What’s in it for you?
- Opportunity to work on a growing product that solves unique digital use cases for the Indian wealth management industry.
- Flexible work options.
- Opportunity to solve deep domain and technology problems, for very large enterprises.
- Opportunity to work with the best brains in fintech.
- Increasing your geek quotient, by attending meetups and conferences. Yes we dig that stuff.
- Grow exponentially by working in small and transparent teams.
- Who says you can’t make friends at work, we do!
Job Title: Senior Data Engineer
Experience: 8 to 11 years
Location: Remote
Notice: Immediate or Max 1Month
Role: Permanent Role
Skill set: Google Cloud Platform, BigQuery, Java, Python, Airflow, Dataflow, Apache Beam.
Experience required:
5 years of experience in software design and development, with 4 years of experience in the data engineering field, is preferred.
2 years of hands-on experience in GCP cloud data implementation suites such as BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, Cloud Storage, etc. (a minimal Beam sketch follows below).
Strong experience and understanding of very large-scale data architecture, solutions, and operationalization of data warehouses, data lakes, and analytics platforms.
Mandatory 1 year of software development skills using Java or Python.
Extensive hands-on experience working with data using SQL and Python.
Must have: GCP, BigQuery, Airflow, Dataflow, Python, Java.
GCP knowledge is a must.
Java as the programming language (preferred).
BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, Cloud Storage.
Python.
Communication should be good.
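For a flavour of the Dataflow/Apache Beam work this role involves, here is a minimal Beam pipeline sketch in Python. The project, dataset, and schema are assumptions; on GCP it would run with the DataflowRunner rather than the default DirectRunner.

```python
# Minimal Apache Beam sketch: create events, transform, write to BigQuery.
import apache_beam as beam


def run():
    with beam.Pipeline() as p:
        (
            p
            | "CreateEvents" >> beam.Create([{"user": "a", "clicks": 3},
                                             {"user": "b", "clicks": 5}])
            # Enrich each record with a derived flag before loading.
            | "AddFlag" >> beam.Map(lambda e: {**e, "heavy": e["clicks"] > 4})
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.user_clicks",   # assumed table spec
                schema="user:STRING,clicks:INTEGER,heavy:BOOLEAN",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```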
at Gipfel & Schnell Consultings Pvt Ltd
Full stack software developer who enjoys solving complex problems
• Solid experience in .NET Core, SQL Server, and React (including REST)
• Experience in building cloud-native applications (Azure)
• Must be skilled at writing scalable, maintainable code
• Must have the ability to independently envision solutions and write clean code
• 5+ years of defining and implementing Application Architecture.
• Demonstrated experience with the .NET ecosystem (.NET Framework, ASP.NET, .NET Core)
• Demonstrated experience with front-end React, HTML and CSS Frameworks
• Experience building modern, scalable, reliable applications on the MS Azure cloud including services such as:
▪ App Services
▪ Azure Service Bus/ Event Hubs
▪ Azure API Management Service
▪ Azure Bot Service
▪ Azure Cognitive Services
▪ Function/Logic Apps
▪ Azure key vault & Azure Configuration Service
▪ CosmosDB
▪ Azure Search
▪ Azure Bot Framework
• Experience with highly available and large-scale cloud deployments.
• Extensive knowledge and experience with Enterprise Level architectural concepts and frameworks.
• Broad knowledge of Agile methodologies and best practices such as SCRUM, Kanban and Continuous Integration.