

Hiring For SDE II - Python (Remote)
The Impact you will create:
- Build campaign generation services capable of sending app notifications at a rate of 10 million per minute
- Build dashboards that show real-time key performance indicators to clients
- Develop complex user segmentation engines that create segments over terabytes of data within seconds
- Build highly available and horizontally scalable platform services for ever-growing data
- Use cloud-based services like AWS Lambda for high throughput and auto-scalability
- Work on complex analytics over terabytes of data, such as building cohorts, funnels, user path analysis, and Recency-Frequency-Monetary (RFM) analysis, at speed
- Build backend services and APIs to create scalable engineering systems
- As an individual contributor, tackle some of our broadest technical challenges, which require deep technical knowledge, hands-on software development, and seamless collaboration across all functions
- Envision and develop features that are highly reliable and fault tolerant to deliver a superior customer experience
- Collaborate with various cross-functional teams across the company to meet deliverables throughout the software development lifecycle
- Identify and improve areas of weakness through data insights and research
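The RFM analysis mentioned above can be sketched in miniature. The thresholds, record layout, and score bands below are hypothetical, and at terabyte scale this would run on a distributed engine rather than in in-memory Python:

```python
from datetime import date

# Minimal RFM sketch: score each user 1-3 on recency, frequency, and
# monetary value from a list of (user_id, order_date, amount) records.

def rfm_scores(orders, today):
    stats = {}
    for user, order_date, amount in orders:
        last, freq, total = stats.get(user, (date.min, 0, 0.0))
        stats[user] = (max(last, order_date), freq + 1, total + amount)

    def bucket(value, thresholds):
        # Return the first score whose threshold the value meets, else 1.
        for score, limit in thresholds:
            if value >= limit:
                return score
        return 1

    scores = {}
    for user, (last, freq, total) in stats.items():
        recency_days = (today - last).days
        r = 3 if recency_days <= 7 else (2 if recency_days <= 30 else 1)
        f = bucket(freq, [(3, 10), (2, 3)])
        m = bucket(total, [(3, 1000.0), (2, 100.0)])
        scores[user] = (r, f, m)
    return scores

orders = [
    ("u1", date(2024, 5, 1), 50.0),
    ("u1", date(2024, 5, 20), 200.0),
    ("u2", date(2024, 1, 5), 1500.0),
]
print(rfm_scores(orders, today=date(2024, 5, 25)))
# → {'u1': (3, 1, 2), 'u2': (1, 1, 3)}
```

The same grouping-then-scoring shape is what a distributed engine would parallelize per user key.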
Primary Responsibilities
- End-to-end ownership of product development, from design through implementation, testing, deployment, and maintenance
- Translating high-level requirements and end-user use cases into design proposals, decomposing complex features into smaller, short-term deliverable tasks
- Maintaining a constant focus on the scalability, performance, and robustness of the architecture
- Designing and implementing logging, monitoring, and alerting systems for existing and new infrastructure
- Documenting APIs and architecture design
- Mentoring and guiding junior developers on their path to becoming solid engineers
What we look for:
- 4+ years of industry experience in technical leadership roles
- Solid knowledge of Python, SQL, NoSQL, shell scripting, and the Linux operating environment
- End-to-end experience in the design and development of highly scalable enterprise and cloud data products
- Ability to challenge and redefine existing architectures to create robust, scalable, and reliable products
- Hands-on experience with the design and troubleshooting of scalable web services, queue-based systems, distributed databases, and streaming services
- Experience with modern DevOps technologies such as kOps, Kubernetes and Docker, CI/CD, monitoring, and autoscaling

Immediate Joiners Preferred. Notice Period - Immediate to 30 Days
Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".
Only applications received via email will be reviewed. Applications through other channels will not be considered.
About Us
adesso India is a dynamic and innovative IT Services and Consulting company based in Kochi. We are committed to delivering cutting-edge solutions that make a meaningful impact on our clients. As we continue to expand our development team, we are seeking a talented and motivated Backend Developer to join us in creating scalable and high-performance backend systems.
Job Description
We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.
Responsibilities
- Development and maintenance of data pipelines and automation scripts with Python
- Creation of data queries and optimization of database processes with SQL
- Use of bash scripts for system administration, automation, and deployment processes
- Database and cloud technologies:
  - Managing, optimizing, and querying large amounts of data in an Exasol database (prospectively Snowflake)
  - Google Cloud Platform (GCP): operation and scaling of cloud-based BI solutions, in particular:
    - Composer (Airflow): orchestration of data pipelines for ETL processes
    - Cloud Functions: development of serverless functions for data processing and automation
    - Cloud Scheduler: planning and automation of recurring cloud jobs
    - Cloud Secret Manager: secure storage and management of sensitive access data and API keys
    - BigQuery: processing, analyzing, and querying large amounts of data in the cloud
    - Cloud Storage: storage and management of structured and unstructured data
    - Cloud Monitoring: monitoring the performance and stability of cloud-based applications
- Data visualization and reporting:
  - Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI
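As a small illustration of the pipeline work listed above, here is a pure-Python transform step of the kind a Composer (Airflow) task might run before loading rows into BigQuery. The field names and schema are hypothetical, and the orchestration and load steps (PythonOperator wrapping, BigQuery load job) are deliberately omitted:

```python
# Sketch of an ETL transform step: drop incomplete records and coerce
# types so rows match a (hypothetical) target table schema.

def transform_rows(raw_rows):
    """Normalize raw event rows: drop incomplete records, coerce types."""
    cleaned = []
    for row in raw_rows:
        if not row.get("user_id") or row.get("amount") is None:
            continue  # skip records missing required fields
        cleaned.append({
            "user_id": str(row["user_id"]),
            "amount": round(float(row["amount"]), 2),
            "country": (row.get("country") or "unknown").lower(),
        })
    return cleaned

raw = [
    {"user_id": 42, "amount": "19.991", "country": "DE"},
    {"user_id": None, "amount": "5.00"},           # dropped: no user_id
    {"user_id": 7, "amount": 3, "country": None},  # country defaults
]
print(transform_rows(raw))
```

Keeping the transform a plain function like this makes it unit-testable outside the DAG.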
Requirements
- Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python, and MongoDB or SQL.
- Strong knowledge of database design, querying, and optimization with SQL and MongoDB, and of designing ETL and orchestration of data pipelines.
- Minimum of 2 years of experience with at least one hyperscaler, ideally GCP, combined with cloud storage technologies, cloud monitoring, and cloud secret management.
- Excellent communication skills to collaborate effectively with team members and stakeholders.
Nice-to-Have:
- Knowledge of agile methodologies and working in cross-functional, collaborative teams.



- Experience building and managing large-scale data/analytics systems.
- Have a strong grasp of CS fundamentals and excellent problem-solving abilities. Have a good understanding of software design principles and architectural best practices.
- Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
- Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
- Be a self-starter: someone who thrives in fast-paced environments with minimal ‘management’.
- Have exposure to and working knowledge of AI environments, with machine learning experience.
- Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, and Elastic.
- Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
- Use the command line like a pro. Be proficient in Git and other essential software development tools.
- Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
- Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog, etc.
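The MapReduce model mentioned above can be illustrated with the canonical word-count example, here as a single-process Python sketch; a real framework (Hadoop, Spark) distributes the map, shuffle, and reduce phases across workers:

```python
from collections import defaultdict
from itertools import chain

# Map each input line to (word, 1) pairs, shuffle the pairs by key,
# then reduce each key's values by summing.

def map_phase(line):
    return [(word, 1) for word in line.split()]

def reduce_phase(word, counts):
    return word, sum(counts)

def mapreduce(lines):
    shuffled = defaultdict(list)
    for key, value in chain.from_iterable(map_phase(l) for l in lines):
        shuffled[key].append(value)
    return dict(reduce_phase(k, v) for k, v in shuffled.items())

print(mapreduce(["to be or not to be"]))
# → {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

Because each map call touches one line and each reduce call one key, both phases parallelize trivially, which is the point of the model.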


- Compiling and analyzing data, processes, and code to troubleshoot problems and identify areas for improvement
- Collaborating with front-end developers and other team members to establish objectives and design more functional, cohesive code to enhance the user experience
- Developing ideas for new programs, products, or features by monitoring industry developments and trends
- Recording data and reporting it to the proper parties, such as clients or leadership
- Participating in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Taking the lead on projects, as

The FactWise product is an end-to-end S2P (source-to-pay) solution designed to transform procurement for product manufacturing companies across industries. Our singular focus is to create a truly distinctive procurement platform that delights users and provides a sustainable positive impact on the organizations we serve. We achieve this by providing transparency and insights to leaders, streamlining and automating processes to improve efficiency, and driving bottom-line impact by unlocking savings potential.
FactWise received funding from a US-based VC, and we are currently deploying our MVP to initial clients. With exciting sales conversations in advanced stages across the Europe, US, and India markets, we have strong relations with leading VC firms and a great journey ahead. Our core team has members from MIT, Stanford, McKinsey, and Amazon US, and we’d love to be joined by passionate, hardworking self-starters looking to align their growth journey with ours.
Responsibilities:
As a member of the development group, you will be primarily responsible for the design, development,
and maintenance of the product:
• Help define and create full-stack architecture and deployment using React, Django, and AWS in an agile environment with lots of ownership and active mentoring
• Work with the Product and Design teams to build new features to solve business problems and fill
business needs
• Participate in code reviews to create robust and maintainable code
• Work in an agile environment where quick iterations and good feedback are a way of life
• Interact with other stakeholders for requirements, design discussions, and for the adoption of new
features
• Communicate and coordinate with our support and professional services teams to solve customer
issues
• Help scale our platform as we expand our product across various markets and verticals globally
As a young, fresh startup, we hope to be joined by self-starting, hardworking, passionate individuals who are committed to delivering their best and who can grow into future leaders of FactWise.

We are looking for a full-time remote Senior Backend Developer who has worked with big data and stream processing, to solve big technical challenges at scale that will reshape the healthcare industry for generations. You will get the opportunity to be involved in big data engineering, novel machine learning pipelines, and highly scalable backend development. The successful candidate will work in a team of highly skilled and experienced developers, data scientists, and the CTO.
Job Requirements
1) Writing well-tested, readable Python code capable of processing large volumes of data
2) Experience with cloud platforms such as GCP, Azure, or AWS is essential
3) The ability to work to project deadlines efficiently and with minimal guidance
4) A positive attitude and a love of working within a globally distributed team
Skills
1) Highly proficient with Python
2) Comfort working with large data sets and high-velocity data streams
3) Experience with microservices and backend services
4) Good working knowledge of relational and NoSQL databases
5) An interest in the healthcare and medical sectors
6) A technical degree with a minimum of 2+ years of backend data-heavy development or data engineering experience in Python
7) ETL/ELT experience desirable
8) Apache Spark, big data pipelines, and stream data processing (e.g. Kafka, Flink, Kinesis, Event Hub) desirable
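The stream-processing work listed in (8) can be sketched without any framework. The example below groups timestamped events into fixed 60-second tumbling windows and counts them per key, the basic building block behind windowing in systems like Flink or Kafka Streams; the event fields are hypothetical:

```python
from collections import defaultdict

# Tumbling-window aggregation: each event carries (timestamp_seconds, key);
# events are bucketed into non-overlapping 60-second windows and counted.

def tumbling_window_counts(events, window_seconds=60):
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "hr"), (30, "hr"), (61, "hr"), (65, "bp"), (125, "hr")]
print(tumbling_window_counts(events))
# → {(0, 'hr'): 2, (60, 'hr'): 1, (60, 'bp'): 1, (120, 'hr'): 1}
```

A real stream processor adds what this sketch leaves out: unbounded input, late/out-of-order events, and state that survives restarts.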

-Strong understanding of the software development cycle.
-Hands-on experience in Python and familiarity with at least one framework, preferably Django.
-Experience with third-party integrations.
-Strong understanding of relational databases (MySQL, PostgreSQL, etc.).
-Comfortable with search engines like Elasticsearch.
-Hands-on experience with AWS services.
-Knowledge of version control tools like Git/SVN.
-Strong unit testing and debugging skills.
-Good understanding of data structures, algorithms, and design patterns.
-Good analytical and problem-solving skills.
-Fluency in, or an understanding of, specific languages such as Java, PHP, HTML, or Python, and operating systems.
Good to have:
-Good exposure to writing and optimizing SQL (such as PostgreSQL) for high-performance systems with large databases.
-Exposure to handling server-side issues and quick resolution.
-Experience working on scalable, high-availability applications/services.

We are looking for a Node.js Developer who is proficient in writing APIs, working with data, using AWS, and applying algorithms, mainly machine-learning-based, to solve problems and create or modify features for our students. Your primary focus will be the development of all server-side logic, the definition and maintenance of the central database, and ensuring high performance and responsiveness to requests from the front end. You will also be responsible for integrating the front-end elements built by your co-workers into the application; therefore, a basic understanding of front-end technologies is necessary as well.
Responsibilities
- Integration of user-facing elements developed by front-end developers with server-side logic
- Writing reusable, testable, and efficient code
- Design and implementation of low-latency, high-availability, and performant applications
- Implementation of security and data protection
- Use of algorithms to drive data analytics and features.
- Ability to use AWS to solve scale issues.
Apply only if you can attend a face-to-face interview in Bangalore.



