Interfaces with other processes and/or business functions to ensure they can leverage the
benefits provided by the AWS Platform process
Responsible for managing the configuration of all IaaS assets across the platforms
Hands-on Python experience
Manages the entire AWS platform (Python, Flask, REST API, Serverless Framework) and
recommends the services that best meet the organization's requirements
Has a good understanding of the various AWS services, particularly S3, Athena, Glue, Lambda,
CloudFormation, and other AWS serverless resources, as well as the Python code that uses them.
AWS certification is a plus
Knowledge of best practices for IT operations in an always-on, always-available service model
Responsible for the execution of the process controls, ensuring that staff comply with process
and data standards
Qualifications
Bachelor’s degree in Computer Science, Business Information Systems or relevant experience and
accomplishments
3 to 6 years of experience in the IT field
AWS Python developer
AWS, Serverless/Lambda, Middleware.
Strong AWS skills including Data Pipeline, S3, RDS, and Redshift, with familiarity with other components
such as Lambda, Glue, Step Functions, and CloudWatch
Must have built REST APIs with AWS Lambda (a minimal handler sketch appears after these requirements).
3+ years of relevant Python experience
Good to have: experience working on projects and problem-solving with large-scale, multi-vendor teams.
Good to have: knowledge of Agile development
Good knowledge of the SDLC.
Hands-on experience with AWS databases (RDS, etc.)
Good to have: unit testing experience.
Good to have: working knowledge of CI/CD.
Good communication skills, as the role involves client interaction and documentation.
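For illustration only, a minimal sketch of the kind of Lambda-backed REST endpoint mentioned above (API Gateway proxy integration); the function name, route, and payload fields are assumptions, not details from this posting:

    # Minimal sketch of a Lambda-backed REST endpoint (API Gateway proxy integration).
    # The route and payload fields are illustrative assumptions.
    import json

    def lambda_handler(event, context):
        """Handle GET /items/{item_id} requests proxied by API Gateway."""
        item_id = (event.get("pathParameters") or {}).get("item_id")
        if item_id is None:
            return {
                "statusCode": 400,
                "headers": {"Content-Type": "application/json"},
                "body": json.dumps({"error": "item_id is required"}),
            }
        # A real service would query S3/RDS/DynamoDB here; this sketch echoes the id back.
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"item_id": item_id, "status": "ok"}),
        }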
Education (degree): Bachelor’s degree in Computer Science, Business Information Systems or relevant
experience and accomplishments
Years of Experience: 3-6 years
Technical Skills
Linux/Unix system administration
Continuous Integration/Continuous Delivery tools like Jenkins
Cloud provisioning and management – Azure, AWS, GCP
Ansible, Chef, or Puppet
Python, PowerShell & BASH
Job Details
JOB TITLE/JOB CODE: AWS Python Developer, III-Sr. Analyst
RC: TBD
PREFERRED LOCATION: HYDERABAD, IND
POSITION REPORTS TO: Manager USI T&I Cloud Managed Platform
CAREER LEVEL: 3
Work Location:
Hyderabad

Similar jobs
Roles:
- Developing core infrastructure in Python and Django.
- Developing models and business logic (e.g. transactions, payments, diet plans, search); a minimal model sketch follows this list.
- Architecting servers and services that enable new product features.
- Building out newly enabled product features
- Minimum 4 years of industry or open-source experience.
- Proficient in at least one OO language: Python (preferred), Golang, or Java.
- Writing high-performance, reliable and maintainable code.
- Good knowledge of database structures, theories, principles, and practices.
- Experience working with AWS components [EC2, S3, RDS, SQS, ECS, Lambda].
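For illustration only, a minimal Django sketch of the "models and business logic" item above; the Payment model and its fields are hypothetical examples, not a specification from this posting:

    # Minimal Django sketch for the "models and business logic" item above.
    # The Payment model and its fields are hypothetical examples.
    from django.db import models

    class Payment(models.Model):
        STATUS_CHOICES = [("pending", "Pending"), ("paid", "Paid"), ("failed", "Failed")]

        user_id = models.IntegerField()
        amount = models.DecimalField(max_digits=10, decimal_places=2)
        status = models.CharField(max_length=10, choices=STATUS_CHOICES, default="pending")
        created_at = models.DateTimeField(auto_now_add=True)

        def mark_paid(self):
            """Business-logic helper: transition a pending payment to paid."""
            if self.status != "pending":
                raise ValueError("Only pending payments can be marked paid")
            self.status = "paid"
            self.save(update_fields=["status"])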
CTC Budget: 35-55LPA
Location: Hyderabad (Remote after 3 months WFO)
Company Overview:
An 8-year-old IT services and consulting company based in Hyderabad, providing services that maximize product value while delivering rapid incremental innovation, with extensive SaaS company M&A experience including 20+ closed transactions on both the buy and sell sides. They have over 100 employees and are looking to grow the team.
- 6 plus years of experience as a Python developer.
- Experience in web development using Python and Django Framework.
- Experience in data analysis and data science using Pandas, NumPy, and scikit-learn (good to have)
- Experience in developing User Interface using HTML, JavaScript, CSS.
- Experience with server-side templating languages including Jinja2 and Mako
- Knowledge of Kafka and RabbitMQ (good to have)
- Experience with Docker, Git, and AWS
- Ability to integrate multiple data sources into a single system.
- Ability to collaborate on projects and work independently when required.
- Databases (MySQL, PostgreSQL, SQL)
Selection Process: 2-3 Interview rounds (Tech, VP, Client)
Your responsibilities as a backend engineer will include:
- Back-end software development
- Software engineering: designing data models and writing effective APIs
- Working together with engineers and product teams
- Understanding business use cases and requirements for different internal teams
- Maintenance of existing projects and new feature development
- Consuming and integrating classifier/ML snippets from the data science team
What we are looking for:
- 4+ years of industry experience with the Python and Django framework.
- Degree in Computer Science or related field
- Good analytical skills with strong fundamentals of data structures and algorithms
- Experience building backend services with hands-on experience through all stages of Agile software development life cycle.
- Ability to write optimized code, debug programs, and integrate applications with third-party tools by developing various APIs
- Experience with databases (relational and non-relational), e.g. Cassandra, MongoDB, PostgreSQL
- Experience writing REST APIs.
- Prototyping initial collection and leveraging existing tools and/or creating new tools
- Experience working with different types of datasets (e.g. unstructured, semi-structured, with missing information)
- Ability to think critically and creatively in a dynamic environment, while picking up new tools and domain knowledge along the way
- A positive attitude, and a growth mindset
Bonus:
- Experience with relevant Python libraries such as scikit-learn, NLTK, TensorFlow, and Hugging Face Transformers
- Hands-on experience with machine learning implementations (a small illustration follows this list)
- Experience with Cloud infrastructure (e.g. AWS) and relevant microservices
- Good sense of humor and a team player
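For illustration only, a small scikit-learn sketch of training and consuming a classifier, in the spirit of the ML items above; the dataset and pipeline choices are placeholders, not project specifics:

    # Minimal scikit-learn sketch: train a small classifier and evaluate it.
    # The dataset and pipeline are placeholders.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    # Scale features, then fit a logistic regression classifier.
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=200))
    model.fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))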
We are looking for a Backend Engineer at Prescribe. You will be responsible for architecting and developing the backend for features being added to the mobile app. You will be joining a talented, collaborative team that is very passionate about solving this massive problem.
Location:
Work from Home
About Prescribe:
Prescribe (YC W21) is one of India's fastest-growing startups in the Healthcare sector, founded by IIT Madras alumni. We are building a D2C brand GetYara in the natural healthcare space.
Requirements and Responsibilities:
Below is a list of several skills required to deliver on responsibilities for this role:
- Comfortable with AWS infrastructure.
- Sound knowledge of JavaScript (ES6).
- Great at problem-solving
- Experience with Amplify and CloudFormation is a plus.
- Bonus: GraphQL, Elasticsearch, SQL
Benefits
- Work flexibility
- Medical insurance
- Work from Home
- Stock Options based on performance
If you have always thought of yourself as entrepreneurial, customer-obsessed, results-oriented, strategic yet execution-focused, hungry, and passionate about technology, we have a dream opportunity for you to back yourself.
1: Proficient in Python, Flask, pandas, GitHub, and AWS (a minimal Flask example follows this list)
2: Good knowledge of databases, both SQL and NoSQL
3: Strong experience with REST and SOAP APIs
4: Experience working on scalable, interactive web applications
5: Basic knowledge of JavaScript and HTML
6: Automation and crawling tools and modules
7: Multithreading and multiprocessing
8: Good understanding of test-driven development
9: Preferred exposure to the finance domain
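For illustration only, a minimal Flask + pandas sketch touching items 1 and 4 above; the route, column names, and in-memory data are assumptions, not details from this posting:

    # Minimal Flask + pandas sketch: a small API serving a pandas computation.
    # The route, column names, and in-memory data are illustrative assumptions.
    import pandas as pd
    from flask import Flask, jsonify

    app = Flask(__name__)

    # Stand-in for data that would normally come from a database or S3.
    prices = pd.DataFrame({"symbol": ["AAA", "AAA", "BBB"], "price": [10.0, 12.0, 7.5]})

    @app.route("/average/<symbol>")
    def average_price(symbol):
        subset = prices[prices["symbol"] == symbol]
        if subset.empty:
            return jsonify({"error": "unknown symbol"}), 404
        return jsonify({"symbol": symbol, "average_price": float(subset["price"].mean())})

    if __name__ == "__main__":
        app.run(debug=True)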
- Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field. Relevant experience of at least 3 years in lieu of the above if from a different stream of education.
- Well-versed in, with 3+ years of hands-on, demonstrable experience in:
  ▪ Stream and batch Big Data pipeline processing using Apache Spark and/or Apache Flink
  ▪ Distributed cloud-native computing, including serverless functions
  ▪ Relational, object store, document, graph, etc. database design and implementation
  ▪ Microservices architecture, API modeling, design, and programming
- 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.
- Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework (a brief PySpark illustration follows this list).
- In-depth knowledge of standard programming languages such as Scala and/or Java.
- 3+ years of hands-on development experience in one or more libraries and frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, and the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie, and Hive); extra points if you can demonstrate your knowledge with working examples.
- 3+ years of hands-on development experience in one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc.
- Practical knowledge of distributed systems involving partitioning, bucketing, the CAP theorem, replication, horizontal scaling, etc.
- Passion for distilling large volumes of data and analyzing performance, scalability, and capacity issues in Big Data platforms.
- Ability to clearly distinguish between system and Spark job performance, and to perform Spark performance tuning and resource optimization.
- Perform benchmarking/stress tests and document best practices for different applications.
- Proactively work with tenants on improving overall performance and ensure the system is resilient and scalable.
- Good understanding of virtualization and containerization; must demonstrate experience with technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed in, with demonstrable working experience in, API management, API gateways, service mesh, identity and access management, and data protection and encryption.
- Hands-on, demonstrable working experience with DevOps tools and platforms such as Jira, Git, Jenkins, code quality and security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in AWS, Azure, and/or Google Cloud; must demonstrate experience with at least FIVE (5) services offered under AWS, Azure, and/or Google Cloud in any of these categories: Compute or Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, Security, Identity & Compliance; or equivalent demonstrable cloud platform experience.
- Good understanding of storage, networks, and storage networking basics, which will enable you to work in a cloud environment.
- Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a business applications/API services environment.
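For illustration only, a brief PySpark sketch of the Spark SQL item above (the posting asks for Scala and/or Java, but the same DataFrame/SQL concepts apply); the input path, column names, and aggregation are assumptions:

    # Minimal PySpark sketch: batch-read JSON events and compute a daily aggregate with Spark SQL functions.
    # The S3 paths and event schema (event_time, user_id, amount) are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("events-batch").getOrCreate()

    # Hypothetical input: newline-delimited JSON events.
    events = spark.read.json("s3://example-bucket/events/*.json")

    daily_totals = (
        events
        .withColumn("day", F.to_date("event_time"))
        .groupBy("day", "user_id")
        .agg(F.sum("amount").alias("total_amount"))
    )

    daily_totals.write.mode("overwrite").parquet("s3://example-bucket/aggregates/daily_totals")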
GeoSpoc is looking for passionate backend developers who would like to solve complex business problems using location-aware data and cutting-edge tools and technologies, such as microservices and cloud platforms like AWS.
General skills
- A passionate developer with solid understanding of software basics
- Always willing to learn and explore upcoming technologies
- Proactive, reliable, and results-oriented
- Someone who can continuously perform in a fast-paced environment
Key Skills Required
- Write effective, scalable code
- Develop back-end components to improve responsiveness and overall performance
- Integrate user-facing elements into applications
- Test and debug programs
- Improve functionality of existing systems
- Implement security and data protection solutions
- Assess and prioritize feature requests
- Coordinate with internal teams to understand user requirements and provide technical solutions
Skills and Experience
- Expertise in at least one popular Python framework (like Django, Flask or Pyramid)
- Knowledge of object-relational mapping (ORM)
- Familiarity with front-end technologies (like JavaScript and HTML5)
Responsibilities
- Writing and testing code, debugging programs and integrating applications with third-party web services
- Work closely with small teams of designers, frontend developers, GIS experts as well as business stakeholders
- Own the development lifecycle of backend systems from design to deployment
- Go above and beyond to deliver great quality software solutions on time
At Cityflo, we are solving the problem of commuting to the office which affects employees in big, populated Indian cities every day. Cityflo provides a bus experience like no other - we run premium AC buses for daily commuters. We’re changing the way urban Indians commute and enabling everyone to reclaim hours of their time every day. Before the imposed lockdown due to coronavirus, we were serving about 7500 commuters per day. We plan to scale to 5,00,000 per day in the next 4 years in a profitable and sustainable manner.
You can read more about our engineering and culture on our blog: https://blog.cityflo.com/tag/engineering/
Role & Requirements
- 2+ year experience in application development
- ability to write efficient SQL queries and design schemas for relational databases
- good knowledge of operating systems and networking concepts
- experience in using and understanding code from Open Source
- experience with implementing best software engineering practices like version control with git, code reviews, writing unit-tests, writing readable code
- experience with Python and Django is a plus.
- inclination towards researching new technologies and adapting them to solve the challenges we face
We look for engineers who
- Are committed to their growth and learning
- care about working in and building a strong engineering culture
- want to take significant ownership and decision making power
- want to make an impact in the real world while working with a great team in a hyper-growth environment
- 3+ years of work experience as a Python developer.
- Sound understanding and knowledge of Python and its ecosystem libraries such as pandas and NumPy; able to write modular code and understand the Python packaging system.
- Experience with web crawling and scraping (Scrapy, BeautifulSoup, Selenium) and web application development using the Django/Flask frameworks (a short scraping example follows this list).
- Experience with data science; exposure to Theano, TensorFlow, PyTorch (preferable)
- Exposure to data mining, PySpark (preferable)
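For illustration only, a short requests + BeautifulSoup sketch of the web-crawling item above; the target URL and the choice of heading tag are placeholders:

    # Minimal scraping sketch using requests + BeautifulSoup.
    # The URL and the <h2> selector are placeholders, not project specifics.
    import requests
    from bs4 import BeautifulSoup

    def fetch_titles(url):
        """Download a page and return the text of all <h2> headings."""
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        soup = BeautifulSoup(response.text, "html.parser")
        return [h2.get_text(strip=True) for h2 in soup.find_all("h2")]

    if __name__ == "__main__":
        print(fetch_titles("https://example.com"))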








