50+ Python Jobs in Pune | Python Job openings in Pune

We are looking for a QA Automation Engineer who has the following expertise -
- Must have hands-on automation experience in Java/Python + Selenium for testing web applications
- Strong Functional Testing fundamentals
- Proficient in bug tracking and test management toolsets (JIRA, Git, etc.)
- Hands-on experience with the Karate/Cucumber frameworks and exposure to Gherkin test scripts
- Strong hands-on experience in automation using Java/Python and Selenium
- Must have experience designing and implementing test frameworks (data-driven, keyword-driven, or hybrid, with custom reporting) and defining the strategy for choosing automation tools and creating testing standards (see the sketch below for a data-driven example)
- Must have experience automating different layers of an application (front end, back end, web services) using appropriate automation approaches
- Good to have: exposure to SQL for database-driven testing
- Strong experience with test strategy, test plans, test design, execution traceability matrices, and test reporting, ensuring tools are used for optimization
- Team player: highly proactive and eager to support colleagues when needed
- You are prepared to take on responsibility for tasks and work independently
- You have excellent organizational and time-management skills
- Knowledge of or basic experience with Agile development processes (Scrum) is a great plus
Other non-negotiable requirements are -
- Good academics
- Good communication skills
- Candidates based in Pune, or ready to commute to Pune for official work, will be preferred, as will candidates who are available immediately.
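For illustration of the data-driven framework approach mentioned above, here is a minimal, hedged sketch using pytest parametrization with Selenium in Python; the URL, locators, and test data are hypothetical placeholders, not details from this posting.

```python
# Minimal data-driven login test sketch (illustrative only; URL, locators,
# and test data are hypothetical placeholders).
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

# Data-driven: each tuple is one test case fed through the same test logic.
LOGIN_CASES = [
    ("valid_user", "valid_pass", "Dashboard"),
    ("valid_user", "wrong_pass", "Invalid credentials"),
]

@pytest.fixture
def browser():
    driver = webdriver.Chrome()          # assumes a local Chrome/driver setup
    driver.implicitly_wait(5)
    yield driver
    driver.quit()

@pytest.mark.parametrize("username,password,expected_text", LOGIN_CASES)
def test_login(browser, username, password, expected_text):
    browser.get("https://example.com/login")                 # placeholder URL
    browser.find_element(By.ID, "username").send_keys(username)
    browser.find_element(By.ID, "password").send_keys(password)
    browser.find_element(By.ID, "submit").click()
    assert expected_text in browser.page_source
```

The same table-of-cases pattern extends to keyword-driven or hybrid frameworks by loading the cases from an external file instead of an in-code list.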
About Tech Prescient - We are a product development and engineering company that works with technology customers to build their products. We work with customers to design and develop their product stack, and hence the quality of work we produce is always premium. We are looking for equally motivated people to join our vibrant team, and we are sure we will make it a win-win situation.
Data Engineers develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On other projects, you might be acting as the architect, leading the design of technical solutions or perhaps overseeing a program inception to build a new product. It could also be a software delivery project where you're equally happy coding and tech-leading the team to implement the solution.
Job Responsibilities
- You will partner with teammates to create complex data processing pipelines that solve our clients' most pressing challenges
- You will collaborate with Data Scientists to design scalable implementations of their models
- You will pair to write clean and iterative code based on TDD
- Leverage various continuous delivery practices to deploy, support and operate data pipelines
- Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available
- Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
- Create data models and speak to the tradeoffs of different modeling approaches
- Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process
- Ensure effective collaboration between Thoughtworks' and the client's teams, encouraging open communication and advocating for shared outcomes
Job Qualifications
Technical skills
- You have a good understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop
- You have built large-scale data pipelines and data-centric applications using distributed storage platforms such as HDFS, S3, or NoSQL databases (HBase, Cassandra, etc.) and distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting (a minimal pipeline sketch follows this list)
- Hands-on experience with MapR, Cloudera, Hortonworks, and/or cloud-based Hadoop distributions (AWS EMR, Azure HDInsight, Qubole, etc.)
- You are comfortable taking data-driven approaches and applying data security strategies to solve business problems
- Working with data excites you: you can build and operate data pipelines, and maintain data storage, all within distributed systems
- You're genuinely excited about data infrastructure and operations and are familiar with working in cloud environments
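To ground the pipeline-building expectations above, here is a minimal, hedged PySpark sketch of a batch aggregation job; the input path, columns, and output location are illustrative assumptions, not details from the posting.

```python
# Minimal batch pipeline sketch with PySpark (paths and columns are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-daily-aggregation").getOrCreate()

# Read raw events from a distributed store (S3/HDFS path is a placeholder;
# assumes the appropriate storage connector is on the classpath).
events = spark.read.json("s3a://example-bucket/raw/events/")

# Basic data-quality gate: drop records missing the fields we aggregate on.
clean = events.dropna(subset=["user_id", "event_type", "event_date"])

# Aggregate daily event counts per type.
daily_counts = (
    clean.groupBy("event_date", "event_type")
         .agg(F.count("*").alias("event_count"))
)

# Write partitioned Parquet for downstream analytics consumers.
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-bucket/curated/daily_event_counts/"
)
spark.stop()
```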
Professional skills
- You're resilient and flexible in ambiguous situations and enjoy solving problems from technical and business perspectives
- An interest in coaching, sharing your experience and knowledge with teammates
- You enjoy influencing others and always advocate for technical excellence while being open to change when needed
- Presence in the external tech community: you willingly share your expertise with others via speaking engagements, contributions to open source, blogs and more
Other things to know
Learning & Development
There is no one-size-fits-all career path at Thoughtworks: however you want to develop your career is entirely up to you. But we also balance autonomy with the strength of our cultivation culture. This means your career is supported by interactive tools, numerous development programs and teammates who want to help you grow. We see value in helping each other be our best and that extends to empowering our employees in their career journeys.
About Thoughtworks
Thoughtworks is a global technology consultancy that integrates strategy, design and engineering to drive digital innovation. For over 30 years, our clients have trusted our autonomous teams to build solutions that look past the obvious. Here, computer science grads come together with seasoned technologists, self-taught developers, midlife career changers and more to learn from and challenge each other. Career journeys flourish with the strength of our cultivation culture, which has won numerous awards around the world.
Join Thoughtworks and thrive. Together, our extra curiosity, innovation, passion and dedication overcomes ordinary.
To be successful in this role, you should possess
- Overall industry experience of 2-6 years
- Bachelor's degree in an analytical subject area (e.g., Engineering, Statistics)
- Proficient in advanced Excel functions and macros, involving complex calculations and pivots
- Exceptional analytical, problem-solving, and logical skills.
- Understanding of relational database concepts and familiar with SQL
- Demonstrable aptitude for innovation and problem solving.
- Good communication skills and the ability to work across cross-functional teams.
- Understands complex utility tariffs, rates and programs and converts these into a model.
- Participates in sprint planning and other ceremonies, and works passionately towards fulfilling the committed sprint goals.
- Knowledge of, and the ability to automate, routine tasks using Python is a PLUS (see the sketch below)
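As a hedged illustration of the routine-task automation in Python mentioned above, the sketch below applies a simple tiered (slab) tariff to monthly usage data with pandas; the file names, column names, and rates are hypothetical placeholders.

```python
# Hypothetical sketch: apply a simple tiered utility tariff to monthly usage data.
import pandas as pd

# Placeholder tiered rates: (upper_kwh_bound, rate_per_kwh)
TIERS = [(100, 3.0), (300, 5.0), (float("inf"), 7.5)]

def bill_amount(kwh: float) -> float:
    """Compute one month's bill under the tiered tariff."""
    amount, lower = 0.0, 0.0
    for upper, rate in TIERS:
        if kwh > lower:
            amount += (min(kwh, upper) - lower) * rate
        lower = upper
    return amount

usage = pd.read_csv("monthly_usage.csv")      # placeholder file with a 'kwh' column
usage["bill"] = usage["kwh"].apply(bill_amount)
usage.to_csv("monthly_bills.csv", index=False)
```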
Preferred Qualifications:
- Experience in the energy industry and familiarity with the basic concepts of utility (electric/gas, etc.) tariffs
- Experience and knowledge of tools like Microsoft Excel macros.
- Familiarity with writing programs using Python or shell scripts.
- Passionate about working with data and data analyses.
- 1+ year experience in Agile methodology.
Roles and responsibilities
- Understands complex utility tariffs, rates and programs and converts these into a model.
- Responsible for analyzing energy utilization data for cost and usage patterns and deriving meaningful insights
- Responsible for maintaining the tariff models with timely updates for price, logic, or other enhancements as per client requirements.
- Assist the delivery team in validating the input data received from clients for modelling work.
- Responsible for communicating and coordinating with the delivery team
- Work with cross-functional teams to resolve issues in the modelling tool.
- Build and deliver compelling demonstrations/visualizations of products
- Be a lifelong learner and develop your skills continuously
- Contribute to the success of a rapidly growing and evolving organization
Additional Project/Soft Skills:
- Should be able to work independently with India- and US-based team members.
- Strong verbal and written communication, with the ability to articulate problems and solutions over phone and email.
- Strong sense of urgency, with a passion for accuracy and timeliness.
- Ability to work calmly in high pressure situations and manage multiple projects/tasks.
- Ability to work independently, with superior skills in issue resolution.
Title/Role: Python Django Consultant
Experience: 8+ Years
Work Location: Indore / Pune /Chennai / Vadodara
Notice period: Immediate to 15 Days Max
Key Skills: Python, Django, Crispy Forms, Authentication, Bootstrap, jQuery, Server Side Rendered, SQL, Azure, React, Django DevOps
Job Description:
- Should have knowledge of, and have created, forms using Django; Crispy Forms is a plus.
- Must have leadership experience
- Should have a good understanding of function-based and class-based views.
- Should have a good understanding of authentication (JWT and token authentication); a minimal sketch follows this list.
- Django: at least one senior with deep Django experience; the other one or two can be mid-to-senior Python or Django.
- Front end: must have React/Angular and CSS experience
- Database: ideally SQL, but the most senior member should have solid DB experience
- Cloud: Azure preferred, but agnostic
- Consulting / client project background is ideal.
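For illustration of the class-based views and token authentication called out above, here is a minimal, hedged sketch assuming Django REST Framework; the endpoint name and response fields are hypothetical, not taken from this posting.

```python
# Hypothetical sketch: a class-based API view protected by DRF token authentication.
from rest_framework.authentication import TokenAuthentication
from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
from rest_framework.views import APIView

class ProfileView(APIView):
    # Only requests carrying a valid "Authorization: Token <key>" header get through.
    authentication_classes = [TokenAuthentication]
    permission_classes = [IsAuthenticated]

    def get(self, request):
        # request.user is populated by the authentication class.
        return Response({"username": request.user.username})
```

For JWT, packages such as djangorestframework-simplejwt provide a JWTAuthentication class that can commonly be swapped in for TokenAuthentication.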
Django Stack:
- Django
- Server Side Rendered HTML
- Bootstrap
- jQuery
- Azure SQL
- Azure Active Directory
- Server-side rendered HTML/jQuery is older tech, but it is what we are OK with for internal tools. This is a good combination of a late-adopter agile stack integrated within an enterprise. Potentially we can push them to React for some discrete projects or pages that need more dynamism.
Django Devops:
- Should have expertise with deploying and managing Django in Azure.
- Django deployment to Azure via Docker.
- Django connection to Azure SQL.
- Django auth integration with Active Directory.
- Terraform scripts to make this setup seamless.
- Easy, proven deployment/setup to AWS and GCP.
- Load balancing, more advanced services, task queues, etc.
Job Location- Kharadi, Pune
Job Duration- 6 Months
About Us :
NonStop io Technologies Pvt. Ltd. is a Pune-based bespoke engineering studio that provides product development as an expertise in software development. The company works with multiple early-stage and funded startups based out of San Francisco, Seattle, New York, London, and other prominent technology hubs around the world.
Job Responsibilities :
Data Collection | Preprocessing | Postprocessing
- Assist in data collection and preprocessing tasks, including data acquisition, cleaning, and labeling
- Collaborate with the team to ensure high-quality data is available for model training
Experimentation and Evaluation
- Conduct experiments to fine-tune and optimize machine learning models
- Evaluate and document the performance of models and propose improvements
Data Visualization
- Create clear and informative data visualizations to communicate insights and findings effectively (a minimal sketch follows the responsibilities list)
Documentation
- Maintain detailed documentation of experiments, methodologies, and code to ensure transparency and knowledge sharing within the team
Research and Innovation
- Stay up-to-date with the latest advancements in AI and machine learning by reviewing relevant research papers
- Contribute to brainstorming sessions and innovative ideas for projects
Collaboration
- Collaborate with cross-functional teams, including data scientists, software engineers, and product managers, to deliver AI solutions that meet business objectives
Model Development
- Work closely with AI/ML engineers and researchers to develop and implement machine learning models
- Participate in algorithm design, feature engineering, and model training and evaluation
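To make the preprocessing and visualization responsibilities above concrete, here is a minimal, hedged sketch using pandas and matplotlib; the dataset and column names are placeholders, not project specifics.

```python
# Hypothetical sketch: clean a labeled dataset and visualize class balance.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("labeled_samples.csv")            # placeholder dataset

# Preprocessing: drop duplicates and rows with a missing label.
df = df.drop_duplicates().dropna(subset=["label"])

# Visualization: class distribution to spot imbalance before training.
df["label"].value_counts().plot(kind="bar", title="Class distribution")
plt.xlabel("label")
plt.ylabel("count")
plt.tight_layout()
plt.savefig("class_distribution.png")
```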
Qualifications :
- Currently pursuing a Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, or a related field
- Strong programming skills in languages such as Python
- Familiarity with platforms like OpenAI, Llama, Bard, Hugging Face, etc.
- Familiarity with machine learning libraries (e.g., TensorFlow, PyTorch, scikit-learn)
- Basic knowledge of data manipulation and analysis tools (e.g., pandas, NumPy)
- Good understanding of machine learning fundamentals and algorithms
- Strong problem-solving and critical-thinking skills
- Excellent communication and teamwork abilities
- Eagerness to learn and adapt to new challenges
- Previous projects or coursework related to AI/ML is a plus
Benefits :
- Gain hands-on experience in AI/ML in a real-world setting
- Work closely with a team of experienced professionals
- Exposure to cutting-edge AI technologies and methodologies
- Mentorship and guidance throughout the internship
- Potential for future career opportunities based on performance
- Competitive stipend
1E9 Advisors is seeking a Python/Django developer to build cutting-edge valuation, risk management, and dispatch/optimization software for the renewable generation and battery energy storage space.
A deep knowledge of data structures and algorithms is essential for this role. We build everything in Python / Django, so expertise in both is important to us.
We are seeking candidates who are numerate and have significant expertise in programming and Python.
RESPONSIBILITIES
- Develop applications
- Test code
- Write documentation
- Deploy applications
- Maintain live services
CORE REQUIREMENTS
- Curiosity
- Analytical skills
- Obsession with details
- Excellent communication skills
- Experience with one or more general-purpose languages
- Expertise in Python 3.7+
ROLE-SPECIFIC SKILLS
- Django
- HTML5
- CSS3
- Bootstrap
- Javascript
- ReactJS / VueJS / AngularJS
- jQuery
- Linux
- Git + GitHub
PREFERRED SKILLS
- Shell scripting
- Pandas
- D3 & Observable
- Flask
We are looking for a hands-on technical expert who has worked with multiple technology stacks and has experience architecting and building scalable cloud solutions with web and mobile frontends.
What will you work on?
- Interface with clients
- Recommend tech stacks
- Define end-to-end logical and cloud-native architectures
- Define APIs
- Integrate with 3rd party systems
- Create architectural solution prototypes
- Hands-on coding, team lead, code reviews, and problem-solving
What Makes You A Great Fit?
- 5+ years of software experience
- Experience architecting technology systems, with hands-on expertise in the backend and in web or mobile frontends
- Solid expertise and hands-on experience in Python with Flask or Django
- Expertise on one or more cloud platforms (AWS, Azure, Google App Engine)
- Expertise with SQL and NoSQL databases (MySQL, MongoDB, Elasticsearch, Redis)
- Knowledge of DevOps practices
- Chatbot, Machine Learning, Data Science/Big Data experience will be a plus
- Excellent communication skills, verbal and written
The job is for a full-time position at our Pune (Viman Nagar) office (https://goo.gl/maps/o67FWr1aedo).
(Note: We are working remotely at the moment. However, once the COVID situation improves, the candidate will be expected to work from our office.)
Hiring alert 🚨
Calling all #PythonDevelopers looking for an #ExcitingJobOpportunity 🚀 with one of our #Insurtech clients.
Are you a Junior Python Developer eager to grow your skills in #BackEnd development?
Our company is looking for someone like you to join our dynamic team. If you're passionate about Python and ready to learn from seasoned developers, this role is for you!
📣 About the company
The client is a fast-growing consultancy firm, helping P&C Insurance companies on their digital journey. With offices in Mumbai and New York, they're at the forefront of insurance tech. Plus, they offer a hybrid work culture with flexible timings, typically between 9 to 5, to accommodate your work-life balance.
💡 What you’ll do
📌 Work with other developers.
📌 Implement Python code with assistance from senior developers.
📌 Write effective test cases, such as unit tests, to ensure the code meets the software design requirements.
📌 Ensure Python code is efficient and well written.
📌 Refactor old Python code to ensure it follows modern principles.
📌 Liaise with stakeholders to understand the requirements.
📌 Ensure integration can take place with front end systems.
📌 Identify and fix code where bugs have been identified.
🔎 What you’ll need
📌 Minimum 3 years of experience writing AWS Lambda functions in Python (a minimal handler sketch follows this list)
📌 Knowledge of other AWS services like CloudWatch and API Gateway
📌 Fundamental understanding of Python and its frameworks.
📌 Ability to write simple SQL queries
📌 Familiarity with AWS Lambda deployment
📌 The ability to problem-solve.
📌 Fast learner with an ability to adapt techniques based on requirements.
📌 Knowledge of how to effectively test Python code.
📌 Great communication and collaboration skills.
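As a hedged illustration of the Lambda-plus-API-Gateway experience listed above, here is a minimal Python handler sketch; the payload shape and response are illustrative assumptions, not client specifics.

```python
# Hypothetical sketch: AWS Lambda handler behind API Gateway (proxy integration).
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)          # log output goes to CloudWatch automatically

def lambda_handler(event, context):
    # API Gateway proxy integration places the request body in event["body"].
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")
    logger.info("Greeting requested for %s", name)

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```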
QA Automation - Locusnine Innovations Pvt. Ltd.
What we need
We are looking for a strong QA automation engineer to join our team. We create new technology every day: new ways to do things, and new ways to connect, collect, and present data. This person will be responsible for making sure that, through technology, we continuously and sustainably deliver quality solutions to our customers. You are expected to have at least 3 years of professional experience in test automation, covering at least one of UI or API testing. You will be expected to write test automation suites from scratch for large product suites and will work with a highly driven team.
Responsibilities
- Design, develop, and maintain automated test scripts for web and mobile applications
- Work with the development team to identify and prioritize test cases
- Execute automated tests and troubleshoot any issues
- Analyze test results and report on the findings
- Stay up-to-date on the latest automation technologies
Who we think will be a great fit…
We are looking for a QA engineer for secured API testing and API automation to join our team. The ideal candidate will have experience designing, developing, and executing automated API tests, securing APIs against common vulnerabilities, and working with Selenium IDE, SeleniumBase, Python, and secured API testing.
You also meet most (if not more) of the following requirements:
- Bachelor's degree in computer science, software engineering, or a related field
- 3+ years of experience in QA automation
- Strong programming skills in Python
- Experience with Selenium IDE, SeleniumBase, and other automation tools
- Knowledge of secured API testing (a minimal sketch follows this list)
- Excellent problem-solving and analytical skills
- Ability to work independently and as part of a team
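A minimal, hedged sketch of the secured API testing skill above, using pytest and requests; the base URL, endpoint, and token are hypothetical placeholders.

```python
# Hypothetical sketch: secured API tests with pytest + requests.
import requests

BASE_URL = "https://api.example.com"      # placeholder

def test_rejects_missing_token():
    # A protected endpoint should refuse unauthenticated calls.
    resp = requests.get(f"{BASE_URL}/v1/orders", timeout=10)
    assert resp.status_code in (401, 403)

def test_accepts_valid_token():
    headers = {"Authorization": "Bearer TEST_TOKEN"}   # placeholder token
    resp = requests.get(f"{BASE_URL}/v1/orders", headers=headers, timeout=10)
    assert resp.status_code == 200
    assert isinstance(resp.json(), list)
```

UI-layer checks with SeleniumBase follow the same assertion-driven pattern against the rendered pages.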
Full Stack Developer Job Description
Position: Full Stack Developer
Department: Technology/Engineering
Location: Pune
Type: Full Time
Job Overview:
As a Full Stack Developer at Invvy Consultancy & IT Solutions, you will be responsible for both front-end and back-end development, playing a crucial role in designing and implementing user-centric web applications. You will collaborate with cross-functional teams including designers, product managers, and other developers to create seamless, intuitive, and high-performance digital solutions.
Responsibilities:
Front-End Development:
Develop visually appealing and user-friendly front-end interfaces using modern web technologies such as C#, HTML5, CSS3, and JavaScript frameworks (e.g., React, Angular, Vue.js).
Collaborate with UX/UI designers to ensure the best user experience and responsive design across various devices and platforms.
Implement interactive features, animations, and dynamic content to enhance user engagement.
Optimize application performance for speed and scalability.
Back-End Development:
Design, develop, and maintain the back-end architecture using server-side technologies (e.g., Node.js, Python, Ruby on Rails, Java, .NET).
Create and manage databases, including data modeling, querying, and optimization.
Implement APIs and web services to facilitate seamless communication between front-end and back-end systems.
Ensure security and data protection by implementing proper authentication, authorization, and encryption measures.
Collaborate with DevOps teams to deploy and manage applications in cloud environments (e.g., AWS, Azure, Google Cloud).
Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
Proven experience as a Full Stack Developer or similar role.
Proficiency in front-end development technologies like HTML5, CSS3, JavaScript, and popular frameworks (React, Angular, Vue.js, etc.).
Strong experience with back-end programming languages and frameworks (Node.js, Python, Ruby on Rails, Java, .NET, etc.).
Familiarity with database systems (SQL and NoSQL) and their integration with web applications.
Knowledge of web security best practices and application performance optimization.
Skills and Qualifications:
- Minimum 3 years of professional experience in Golang development.
- Strong proficiency in Golang programming language, including knowledge of Go routines, channels, and error handling.
- Proficiency in Terraform for provisioning and managing infrastructure.
- Hands-on experience with AWS services such as EC2, S3, RDS, Lambda, and CloudFormation.
- Solid understanding of software development principles, data structures, and algorithms.
- Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes.
- Experience with version control systems, such as Git.
- Knowledge of software testing frameworks and methodologies.
- Strong problem-solving and analytical skills.
- Excellent teamwork and communication skills.
- Ability to work independently and manage multiple tasks and priorities effectively.
We're seeking an AI/ML Engineer to join our team
As an AI/ML Engineer, you will be responsible for designing, developing, and implementing artificial intelligence (AI) and machine learning (ML) solutions to solve real world business problems. You will work closely with cross-functional teams, including data scientists, software engineers, and product managers, to deploy and integrate Applied AI/ML solutions into the products that are being built at NonStop io. Your role will involve researching cutting-edge algorithms, data processing techniques, and implementing scalable solutions to drive innovation and improve the overall user experience.
Responsibilities
- Applied AI/ML engineering: building engineering solutions on top of the AI/ML tooling available in the industry today (e.g., engineering APIs around OpenAI)
- AI/ML Model Development: Design, develop, and implement machine learning models and algorithms that address specific business challenges, such as natural language processing, computer vision, recommendation systems, anomaly detection, etc.
- Data Preprocessing and Feature Engineering: Cleanse, preprocess, and transform raw data into suitable formats for training and testing AI/ML models. Perform feature engineering to extract relevant features from the data
- Model Training and Evaluation: Train and validate AI/ML models using diverse datasets to achieve optimal performance. Employ appropriate evaluation metrics, such as accuracy, precision, and recall (a minimal sketch follows this list)
- Data Visualization: Create clear and insightful data visualizations to aid in understanding data patterns, model behavior, and performance metrics
- Deployment and Integration: Collaborate with software engineers and DevOps teams to deploy AI/ML models into production environments and integrate them into various applications and systems
- Data Security and Privacy: Ensure compliance with data privacy regulations and implement security measures to protect sensitive information used in AI/ML processes
- Continuous Learning: Stay updated with the latest advancements in AI/ML research, tools, and technologies, and apply them to improve existing models and develop novel solutions
- Documentation: Maintain detailed documentation of the AI/ML development process, including code, models, algorithms, and methodologies for easy understanding and future reference
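As a hedged sketch of the training-and-evaluation responsibility above, using scikit-learn on a synthetic dataset; the features and model choice are illustrative assumptions, not project specifics.

```python
# Hypothetical sketch: train and evaluate a classifier with scikit-learn.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for real project data.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
preds = model.predict(X_test)

print("accuracy:", accuracy_score(y_test, preds))
print("precision:", precision_score(y_test, preds))
print("recall:", recall_score(y_test, preds))
```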
Requirements
- Bachelor's, Master's or PhD in Computer Science, Data Science, Machine Learning, or a related field. Advanced degrees or certifications in AI/ML are a plus
- Proven experience as an AI/ML Engineer, Data Scientist, or related role, ideally with a strong portfolio of AI/ML projects
- Proficiency in programming languages commonly used for AI/ML, preferably Python
- Familiarity with popular AI/ML libraries and frameworks, such as TensorFlow, PyTorch, scikit-learn, etc.
- Familiarity with popular AI/ML models such as GPT-3, GPT-4, Llama 2, BERT, etc.
- Strong understanding of machine learning algorithms, statistics, and data structures
- Experience with data preprocessing, data wrangling, and feature engineering
- Knowledge of deep learning architectures, neural networks, and transfer learning
- Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud) for scalable AI/ML deployment
- Solid understanding of software engineering principles and best practices for writing maintainable and scalable code
- Excellent analytical and problem-solving skills, with the ability to think critically and propose innovative solutions
- Effective communication skills to collaborate with cross-functional teams and present complex technical concepts to non-technical stakeholders
We're seeking an experienced Backend Software Engineer to join our team.
As a backend engineer, you will be responsible for designing, developing, and deploying scalable backends for the products we build at NonStop.
This includes APIs, databases, and server-side logic.
Responsibilities
- Design, develop, and deploy backend systems, including APIs, databases, and server-side logic
- Write clean, efficient, and well-documented code that adheres to industry standards and best practices
- Participate in code reviews and contribute to the improvement of the codebase
- Debug and resolve issues in the existing codebase
- Develop and execute unit tests to ensure high code quality
- Work with DevOps engineers to ensure seamless deployment of software changes
- Monitor application performance, identify bottlenecks, and optimize systems for better scalability and efficiency
- Stay up-to-date with industry trends and emerging technologies; advocate for best practices and new ideas within the team
- Collaborate with cross-functional teams to identify and prioritize project requirements
Requirements
- 3+ years of experience building scalable and reliable backend systems
- Strong proficiency in at least one programming language such as Python, Node.js, Golang, or RoR
- Experience with at least one framework such as Django, Express, or gRPC
- Knowledge of database systems such as MySQL, PostgreSQL, MongoDB, Cassandra, or Redis
- Familiarity with containerization technologies such as Docker and Kubernetes
- Understanding of software development methodologies such as Agile and Scrum
- Flexibility to pick up a new technology stack and ramp up on it fairly quickly
- Bachelor's/Master's degree in Computer Science or related field
- Strong problem-solving skills and ability to collaborate effectively with cross-functional teams
- Good written and verbal communication skills in English
Who We Are:
DeepIntent is leading the healthcare advertising industry with data-driven solutions built for the future. From day one, our mission has been to improve patient outcomes through the artful use of advertising, data science, and real-world clinical data.
What You’ll Do:
We are looking for a Senior Software Engineer based in Pune, India who can master both DeepIntent’s data architectures and pharma research and analytics methodologies to make significant contributions to how health media is analyzed by our clients. This role requires an Engineer who not only understands DBA functions but also how they impact research objectives and can work with researchers and data scientists to achieve impactful results.
This role will be in the Analytics Organization and will require integration and partnership with the Engineering Organization. The ideal candidate is a self-starter who is inquisitive, is not afraid to take on and learn from challenges, and will constantly seek to improve the facets of the business they manage. The ideal candidate will also need to demonstrate the ability to collaborate and partner with others.
- Serve as the Engineering interface between Analytics and Engineering teams
- Develop and standardize all interface points for analysts to retrieve and analyze data, with a focus on research methodologies and data-based decisioning
- Optimize queries and data access efficiency, and serve as the expert on how to most efficiently attain desired data points
- Build “mastered” versions of the data for Analytics specific querying use cases
- Help with data ETL, table performance optimization
- Establish formal data practice for the Analytics practice in conjunction with rest of DeepIntent
- Build & operate scalable and robust data architectures
- Interpret analytics methodology requirements and apply to data architecture to create standardized queries and operations for use by analytics teams
- Implement DataOps practices
- Master existing and new data pipelines and develop appropriate queries to meet analytics-specific objectives (a minimal sketch follows this list)
- Collaborate with various business stakeholders, software engineers, machine learning engineers, analysts
- Operate between Engineers and Analysts to unify both practices for analytics insight creation
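For illustration of the standardized, analytics-facing pipelines described above, here is a minimal, hedged sketch assuming Apache Airflow (which is named in the qualifications below); the DAG name, schedule, and task body are placeholders.

```python
# Hypothetical sketch: a daily Airflow DAG that refreshes a "mastered" analytics table.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def refresh_mastered_table():
    # Placeholder for the real work: run a standardized query and write the
    # result to an analytics-facing table (e.g., in BigQuery or ClickHouse).
    print("refreshing mastered analytics table")

with DAG(
    dag_id="analytics_mastered_refresh",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="refresh_mastered_table",
        python_callable=refresh_mastered_table,
    )
```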
Who You Are:
- Adept in market research methodologies and using data to deliver representative insights
- Inquisitive, curious, understands how to query complicated data sets, move and combine data between databases
- Deep SQL experience is a must
- Exceptional communication skills, with the ability to collaborate and translate between technical and non-technical needs
- English language fluency and proven success working with teams in the U.S.
- Experience in designing, developing and operating configurable Data pipelines serving high volume and velocity data
- Experience working with public clouds like GCP/AWS
- Good understanding of software engineering, DataOps, and data architecture, Agile and DevOps methodologies
- Experience building Data architectures that optimize performance and cost, whether the components are prepackaged or homegrown
- Proficient with SQL, Python or a JVM-based language, and Bash
- Experience with Apache open-source projects such as Spark, Druid, Beam, Airflow, etc., and big data databases like BigQuery, ClickHouse, etc.
- Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious
- Comfortable to work in EST Time Zone
What You’ll Do:
We are currently seeking a highly skilled and motivated Product Analyst to join our team. The ideal candidate will be responsible for working closely with multiple stakeholders within the company to determine the metrics needed to make critical business decisions. This person should have a strong background in data analysis, Google Analytics, Looker, SQL, and other relevant skills for a Product Analytics Professional.
- Collaborate with cross-functional teams including Product Management, UX, Engineering, and Marketing to identify key metrics that drive the business
- Develop and maintain detailed dashboards and reports using tools like Google Analytics, Looker, and SQL to provide actionable insights to stakeholders.
- Develop Python automation scripts to extract, transform, cleanse, and aggregate data for analytics purposes (a minimal sketch follows this list)
- Conduct deep-dive analyses to identify trends, opportunities, and areas for improvement within our platform
- Communicate findings and recommendations to stakeholders in a clear, concise, and visually appealing manner.
- Ensure data accuracy and consistency by implementing best practices for data management and validation.
- Build and maintain data models that support business intelligence and analytics initiatives
- Design and implement experiments to test hypotheses and drive product improvements
- Stay up-to-date with industry trends and best practices in product analytics
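A minimal, hedged sketch of the Python automation mentioned above: extracting raw event data, cleansing it, and aggregating it into a dashboard-ready table. The file names and columns are hypothetical placeholders.

```python
# Hypothetical sketch: cleanse and aggregate raw product events for dashboards.
import pandas as pd

events = pd.read_csv("raw_product_events.csv", parse_dates=["event_time"])  # placeholder

# Cleanse: drop malformed rows and normalize the event name.
events = events.dropna(subset=["user_id", "event_name"])
events["event_name"] = events["event_name"].str.strip().str.lower()

# Aggregate: daily active users and event counts per event type.
daily = (
    events.assign(event_date=events["event_time"].dt.date)
          .groupby(["event_date", "event_name"])
          .agg(users=("user_id", "nunique"), events=("user_id", "size"))
          .reset_index()
)
daily.to_csv("daily_product_metrics.csv", index=False)   # feed for Looker/BI dashboards
```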
Who You Are:
- 3+ years of experience in product analytics, business intelligence, or data analysis
- Bachelor’s or Master’s degree in Business, Statistics, Mathematics, Computer Science, or a related field is preferred but not required
- Strong knowledge of Python for use in Analytics
- Strong knowledge of Google Analytics, Looker, SQL, and other data analysis tools
- Excellent problem-solving skills with a data-driven mindset
- Experience with A/B testing and experiment design
- Excellent communication and presentation skills
- Strong attention to detail and the ability to manage multiple projects simultaneously
- Comfortable working with local and distributed teams
- Experience working in the AdTech industry is a plus
DeepIntent is leading the healthcare advertising industry with data-driven solutions built for the future. From day one, our mission has been to improve patient outcomes through the artful use of advertising, data science, and real-world clinical data.
What You’ll Do:
We are looking for a talented candidate with several years of experience in software Quality Assurance to join our QA team. This position will be at an individual contributor level as part of a collaborative, fast-paced team. As a member of the QA team, you will work closely with Product Managers and Developers to understand application features and create robust comprehensive test plans, write test cases, and work closely with the developers to make the applications more testable. We are looking for a well-rounded candidate with solid analytical skills, an enthusiasm for taking ownership of features, a strong commitment to quality, and the ability to work closely and communicate effectively with development and other teams. Experience with the following is preferred:
- Python
- Perl
- Shell Scripting
- Selenium
- Test Automation (QA)
- Software Testing (QA)
- Software Development (MUST HAVE)
- SDET (MUST HAVE)
- MySQL
- CI/CD
Who You Are:
- Hands-on experience with QA automation framework development and design (preferred language: Python)
- Strong understanding of testing methodologies
- Scripting
- Strong problem analysis and troubleshooting skills
- Experience in databases, preferably MySQL
- Debugging skills
- REST/API testing experience is a plus
- Integrate end-to-end tests with CI/CD pipelines and monitor and improve metrics around test coverage
- Ability to work in a dynamic and agile development environment and be adaptable to changing requirements
- Performance testing experience with relevant automation and monitoring tools
- Exposure to Dockerization or Virtualization is a plus
- Experience working in the Linux/Unix environment
- Basic understanding of OS
DeepIntent is committed to bringing together individuals from different backgrounds and perspectives. We strive to create an inclusive environment where everyone can thrive, feel a sense of belonging, and do great work together.
DeepIntent is an Equal Opportunity Employer, providing equal employment and advancement opportunities to all individuals. We recruit, hire and promote into all job levels the most qualified applicants without regard to race, color, creed, national origin, religion, sex (including pregnancy, childbirth and related medical conditions), parental status, age, disability, genetic information, citizenship status, veteran status, gender identity or expression, transgender status, sexual orientation, marital, family or partnership status, political affiliation or activities, military service, immigration status, or any other status protected under applicable federal, state and local laws. If you have a disability or special need that requires accommodation, please let us know in advance.
DeepIntent’s commitment to providing equal employment opportunities extends to all aspects of employment, including job assignment, compensation, discipline and access to benefits and training.
The role is with a Fintech Credit Card company based in Pune, within the Decision Science team (OneCard).
About
Credit cards haven't changed much for over half a century so our team of seasoned bankers, technologists, and designers set out to redefine the credit card for you - the consumer. The result is OneCard - a credit card reimagined for the mobile generation. OneCard is India's best metal credit card built with full-stack tech. It is backed by the principles of simplicity, transparency, and giving back control to the user.
The Engineering Challenge
“Re-imaging credit and payments from First Principles”
Payments is an interesting engineering challenge in itself with requirements of low latency, transactional guarantees, security, and high scalability. When we add credit and engagement into the mix, the challenge becomes even more interesting with underwriting and recommendation algorithms working on large data sets. We have eliminated the current call center, sales agent, and SMS-based processes with a mobile app that puts the customers in complete control. To stay agile, the entire stack is built on the cloud with modern technologies.
Purpose of Role :
- Develop and implement the collection analytics and strategy function for the credit cards. Use analysis and customer insights to develop optimum strategy.
CANDIDATE PROFILE :
- Successful candidates will have in-depth knowledge of statistical modelling and data analysis tools (Python, R, etc.) and techniques. They will be adept communicators with good interpersonal skills, able to work with senior stakeholders in India to grow revenue, primarily by identifying, delivering, and creating new, profitable analytics solutions.
We are looking for someone who has:
- A proven track record in collection and risk analytics, preferably in the Indian BFSI industry (this is a must)
- The ability to identify and deliver appropriate analytics solutions
- Experience in analytics team management
Essential Duties and Responsibilities :
- Responsible for delivering high quality analytical and value added services
- Responsible for automating insights and proactive actions on them to mitigate collection Risk.
- Work closely with the internal team members to deliver the solution
- Engage Business/Technical Consultants and delivery teams appropriately so that there is a shared understanding and agreement on how to deliver the proposed solution
- Use analysis and customer insights to develop value propositions for customers
- Maintain and enhance the suite of suitable analytics products.
- Actively seek to share knowledge within the team
- Share findings with peers from other teams and management where required
- Actively contribute to setting best practice processes.
Knowledge, Experience and Qualifications :
Knowledge :
- Good understanding of collection analytics preferably in Retail lending industry.
- Knowledge of statistical modelling/data analysis tools (Python, R etc.), techniques and market trends
- Knowledge of different modelling frameworks like linear regression, logistic regression, multiple regression, logit, probit, time-series modelling, CHAID, CART, etc. (a minimal sketch follows this list)
- Knowledge of Machine learning & AI algorithms such as Gradient Boost, KNN, etc.
- An understanding of decisioning and portfolio management in banking and financial services would be an added advantage
- An understanding of credit bureaus would be an added advantage
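As a hedged, minimal illustration of the logit-style modelling listed above, here is a sketch using statsmodels on synthetic data; the features are placeholders standing in for bureau/behaviour variables, not an actual collection-risk model.

```python
# Hypothetical sketch: a simple logit model, e.g. probability of roll-forward in collections.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
# Synthetic features standing in for bureau / behaviour variables.
X = np.column_stack([
    rng.normal(0, 1, n),        # e.g. utilization z-score (placeholder)
    rng.normal(0, 1, n),        # e.g. days-past-due z-score (placeholder)
])
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 0).astype(int)

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(model.summary())
print("predicted probabilities:", model.predict(sm.add_constant(X))[:5])
```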
Experience :
- 4 to 8 years of work experience in the core analytics function of a large bank or consulting firm.
- Experience working on collection analytics is a must
- Experience handling large data volumes using data analysis tools and generating good data insights
- Demonstrated ability to communicate ideas and analysis results effectively both verbally and in writing to technical and non-technical audiences
- Excellent communication, presentation, and writing skills; strong interpersonal skills
- Motivated to meet and exceed stretch targets
- Ability to make the right judgments in the face of complexity and uncertainty
- Excellent relationship and networking skills across our different businesses and geographies
Qualifications :
- Master's degree in Statistics, Mathematics, Economics, Business Management, or Engineering from a reputed college
About UpSolve
We build and deliver complex AI solutions that help drive business decisions faster and more accurately. We are a typical AI company and have a range of solutions developed for video, image, and text.
What you will do
- Stay informed about new technologies and implement them cautiously
- Maintain necessary documentation for the project
- Fix the issues reported by application users
- Plan, build, and design solutions with a mental note of future requirements
- Coordinate with the development team to manage fixes, code changes, and merging
Location: Mumbai
Working Mode: Remote
What are we looking for
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- Minimum 2 years of professional experience in software development, with a focus on machine learning and full stack development.
- Strong proficiency in Python programming language and its machine learning libraries such as TensorFlow, PyTorch, or scikit-learn.
- Experience in developing and deploying machine learning models in production environments.
- Proficiency in web development technologies including HTML, CSS, JavaScript, and front-end frameworks such as React, Angular, or Vue.js.
- Experience in designing and developing RESTful APIs and backend services using frameworks like Flask or Django.
- Knowledge of databases and SQL for data storage and retrieval.
- Familiarity with version control systems such as Git.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to work effectively in a fast-paced and dynamic team environment.
- Good to have Cloud Exposure
ROLE DESCRIPTION
The ideal candidate will be passionate about building resilient, scalable, and high-performance distributed systems products. This individual will thrive and succeed in delivering high-quality technology products in a fast-paced and rapid growth environment where priorities could shift quickly. We are looking for an engineer who prioritizes well, communicates clearly, and understands how to drive a high level of focus and excellence within a strong team. This person has an innate drive to build a culture centered on customer focus, efficient execution, high quality, rigorous testing, deep monitoring, and solid software engineering practices.
WHO WILL LOVE THIS JOB?
• Attracted to creativity, innovation, and eagerness to learn.
• Alignment to a fast-paced organization and its short-term and long-term goals.
• An engaging, open, genuine personality that naturally encourages interaction with individuals at all levels.
• Strong value system and sense of ethics.
• Absolute dedication to premium quality.
• Want to build a strong core product team capable of developing solutions for complex, industry-first problems
• Build a balance of experience, knowledge and new learnings
ROLE & RESPONSIBILITIES
• Driving the success of the software engineering team at Datamotive.
• Driving go/no-go decisions for product releases to customers.
• Driving the QE team to develop test scenarios and product automation.
• Collaborating with senior and peer engineers to identify and deliver feature improvements.
• Build a strong customer focussed mindset for qualifying product features and use cases.
• Develop, build, and perform functional, scale, and performance testing.
• Assist in Identifying, Researching & Designing newer features and cloud platform support in areas of disaster recovery, data protection, workload migration etc.
• Conduct pilot tests to assess the functionality of newly developed programs.
• Face customers for product introductions, knowledge transfer, solutions, bug triaging, etc.
• Assist customers by giving product demos, conducting POCs, training etc.
• Manage Datamotive infrastructure, bringing innovative automation for optimizing infrastructure usage through monitoring and scripting.
• Design test environments to simulate customer behaviors and use cases in VMware vSphere, AWS, GCP, and Azure clouds.
• Help write technical documentation, and generate marketing content like blogs, webinars, seminars etc.
TECHNICAL SKILLS
• 8 - 12 years of experience in software testing with a relevant domain understanding of Data Protection, Disaster Recovery, and Ransomware Recovery.
• A strong understanding and demonstrable experience with at least one of the major public cloud platforms (GCP, AWS, Azure or VMware)
• A strong understanding and experience in qualifying complex, distributed systems at feature, scale and performance.
• Insights into the development of client-server SaaS applications with good breadth across networking, storage, micro-services, and other web technologies.
• Programming knowledge in either Python, Shell scripts or Powershell.
• Strong knowledge of test automation frameworks, e.g., Selenium, Cucumber, and Robot Framework
• Should be a computer science graduate with strong fundamentals & problem-solving abilities.
• Good understanding of virtualization, storage and cloud platforms like VMware, AWS, GCP, Azure and/or Kubernetes will be preferable.
WHAT’S IN IT FOR YOU?
• Impact. Backed by our TEAM, Investors and Advisors, Datamotive is on the path to rapid growth. As we take our products to the market, your position will be vital as you play a crucial role in innovating and developing our products, identifying new features, and filing patents, while also gaining personal experience and responsibilities. As a key player in our company's success, the impact of your work will be felt as we grow as an organization.
• Career Growth. At Datamotive, we highly value the input made by each employee to help us achieve our company goals. To this end, we strive to ensure that everyone has access, and exposure to be up-to-date in the industry, and to learn and improve their expertise. We ensure that each employee is given exposure to understanding the functional and technical elements of our products as well as all related business functions. As your knowledge grows, so do the opportunities for advancement to more senior opportunities or into other areas of our business. We strive to be a company where you can truly chart out a career path for yourself.
- Develop, train, and optimize machine learning models using Python, ML algorithms, deep learning frameworks (e.g., TensorFlow, PyTorch), and other relevant technologies.
- Implement MLOps best practices, including model deployment, monitoring, and versioning.
- Utilize Vertex AI, MLflow, Kubeflow, TFX, and other relevant MLOps tools and frameworks to streamline the machine learning lifecycle (a minimal MLflow sketch follows this list).
- Collaborate with cross-functional teams to design and implement CI/CD pipelines for continuous integration and deployment using tools such as GitHub Actions, TeamCity, and similar platforms.
- Conduct research and stay up-to-date with the latest advancements in machine learning, deep learning, and MLOps technologies.
- Provide guidance and support to data scientists and software engineers on best practices for machine learning development and deployment.
- Assist in developing tooling strategies by evaluating various options, vendors, and product roadmaps to enhance the efficiency and effectiveness of our AI and data science initiatives.
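A hedged sketch of the MLflow side of the toolchain above: logging parameters, metrics, and a model artifact for versioning. The experiment name and model choice are illustrative assumptions.

```python
# Hypothetical sketch: track a training run with MLflow for reproducibility and versioning.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("demo-experiment")          # placeholder experiment name
with mlflow.start_run():
    n_estimators = 100
    model = RandomForestClassifier(n_estimators=n_estimators).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    mlflow.log_param("n_estimators", n_estimators)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")      # versioned model artifact
```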
WHO WILL LOVE THIS JOB?
• Attracted to creativity, innovation, and eagerness to learn
• Alignment to a fast-paced organization and its short-term and long-term goals
• An engaging, open, genuine personality that naturally encourages interaction with individuals at all levels
• Strong value system and sense of ethics
• Absolute dedication to premium quality
• Want to build a strong core product team capable of developing solutions for complex, industry-first problems.
• Build a balance of experience, knowledge, and new learnings
ROLES AND RESPONSIBILITIES
• Driving the success of the software engineering team at Datamotive.
• Collaborating with senior and peer engineers to prioritize and deliver features on the roadmap.
• Build a strong development team with a focus on building optimized and usable solutions.
• Research, design, and develop distributed solutions to handle workload mobility across multi- and hybrid clouds
• Assist in Identifying, Researching & Designing newer features and cloud platform support in areas of disaster recovery, data protection, workload migration etc.
• Assist in building product roadmap.
• Conduct pilot tests to assess the functionality of newly developed programs.
• Face customers for product introductions, knowledge transfer, solutioning, bug triaging, etc.
• Assist customers by giving product demos, conducting POCs, training, etc.
• Manage Datamotive infrastructure, bring innovative automation for optimizing infrastructure usage through monitoring and scripting.
• Design test environments to simulate customer behaviours and use cases in VMware vSphere, AWS, GCP, Azure clouds.
• Help write technical documentation, generate marketing content like blogs, webinars, seminars etc.
TECHNICAL SKILLS
• 3 – 8 years of experience in software development with relevant domain understanding of Data Protection, Disaster Recovery, Ransomware Recovery.
• A strong understanding and demonstrable experience with at least one of the major public cloud platforms (GCP, AWS, Azure or VMware)
• A strong understanding and experience of designing and developing architecture of complex, distributed systems.
• Insights into development of client-server SaaS applications with good breadth across networking, storage, micro-services, and other web technologies.
• Experience building and leading strong development teams, with a systems product development background
• Programming knowledge in at least one of Go, C, C++, Python, or shell scripting.
• Should be a computer science graduate with strong fundamentals & problem-solving abilities.
• Good understanding of virtualization, storage and cloud platforms like VMware, AWS, GCP, Azure and/or Kubernetes will be preferable
About Us
Mindtickle provides a comprehensive, data-driven solution for sales readiness and enablement that fuels revenue growth and brand value for dozens of Fortune 500 and Global 2000 companies and hundreds of the world’s most recognized companies across technology, life sciences, financial services, manufacturing, and service sectors.
With purpose-built applications, proven methodologies, and best practices designed to drive effective sales onboarding and ongoing readiness, mindtickle enables company leaders and sellers to continually assess, diagnose and develop the knowledge, skills, and behaviors required to engage customers and drive growth effectively. We are funded by great investors, like – Softbank, Canaan partners, NEA, Accel Partners, and others.
Job Brief
We are looking for a rockstar researcher at the Center of Excellence for Machine Learning. You are responsible for thinking outside the box, crafting new algorithms, developing end-to-end artificial intelligence-based solutions, and selecting the most appropriate architecture for the system(s) so that it suits the business needs and achieves the desired results under the given constraints.
Credibility:
- You must have a proven track record in research and development with adequate publication/patenting and/or academic credentials in data science.
- You have the ability to directly connect business problems to research problems along with the latest emerging technologies.
Strategic Responsibility:
- To perform the following: understanding problem statements, connecting the dots between high-level business statements and deep technology algorithms, crafting new systems and methods in the space of structured data mining, natural language processing, computer vision, speech technologies, robotics or Internet of things etc.
- To be responsible for end-to-end production level coding with data science and machine learning algorithms, unit and integration testing, deployment, optimization and fine-tuning of models on cloud, desktop, mobile or edge etc.
- To learn in a continuous mode, upgrade and upskill along with publishing novel articles in journals and conference proceedings and/or filing patents, and be involved in evangelism activities and ecosystem development etc.
- To share knowledge, mentor colleagues, partners, and customers, take sessions on artificial intelligence topics both online or in-person, participate in workshops, conferences, seminars/webinars as a speaker, instructor, demonstrator or jury member etc.
- To design and develop high-volume, low-latency applications for mission-critical systems and deliver high availability and performance.
- To collaborate within the product streams and team to bring best practices and leverage world-class tech stack.
- To set up all essentials (tracking/alerting) to make sure the infrastructure/software built is working as expected.
- To search, collect, and clean data for analysis, and to set up efficient storage and retrieval pipelines.
Personality:
- Requires excellent communication skills – written, verbal, and presentation.
- You should be a team player.
- You should be positive towards problem-solving and have a very structured thought process to solve problems.
- You should be agile enough to learn new technology if needed.
Qualifications:
- B Tech / BS / BE / M Tech / MS / ME in CS or equivalent from Tier I / II or Top Tier Engineering Colleges and Universities.
- 6+ years of strong software (application or infrastructure) development experience and software engineering skills (Python, R, C, C++ / Java / Scala / Golang).
- Deep expertise and practical knowledge of operating systems, MySQL, and NoSQL databases (Redis, Couchbase, MongoDB, ES, or any graph DB).
- Good understanding of Machine Learning Algorithms, Linear Algebra and Statistics.
- Working knowledge of Amazon Web Services(AWS).
- Experience with Docker and Kubernetes will be a plus.
- Experience with Natural Language Processing, Recommendation Systems, or Search Engines.
Our Culture
As an organization, it’s our priority to create a highly engaging and rewarding workplace. We offer tons of awesome perks, great learning opportunities & growth.
Our culture reflects the globally diverse backgrounds of our employees along with our commitment to our customers, each other, and a passion for excellence.
To know more about us, feel free to go through these videos:
1. Sales Readiness Explained: https://www.youtube.com/watch?v=XyMJj9AlNww&t=6s
2. What We Do: https://www.youtube.com/watch?v=jv3Q2XgnkBY
3. Ready to Close More Deals, Faster: https://www.youtube.com/watch?v=nB0exreVU-s
To view more videos, please access the below-mentioned link:
https://www.youtube.com/c/mindtickle/videos
Mindtickle is proud to be an Equal Opportunity Employer
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, or any other characteristic protected by law.
Your Right to Work - In compliance with applicable laws, all persons hired will be required to verify identity and eligibility to work in the respective work locations and to complete the required employment eligibility verification document form upon hire.
- Seeking an individual with around 5+ years of experience.
- Must have skills - Jenkins, Groovy, Ansible, Shell Scripting, Python, Linux Admin
- Deep knowledge of Terraform and AWS to automate and provision EC2, EBS, and SQL Server; cost optimization; CI/CD pipelines using Jenkins. Serverless automation is a plus.
- Excellent writing and communication skills in English. Enjoy writing crisp and understandable documentation
- Comfortable programming in one or more scripting languages
- Enjoys tinkering with tooling and finding easier ways to handle systems through research. Strong awareness of build vs. buy.
Experience: 4-8 years
Notice Period: 15-30 days
Mandatory Skill Set:
Front End: ReactJS / JavaScript / CSS / jQuery / Bootstrap
Backend: Python / Django / Flask / Tornado
Responsibilities :
- Responsible for the design and architecture of functional prototypes and production-ready systems
- Use open-source frameworks as appropriate; Django preferred.
- Develop Python and JavaScript code as necessary.
- Coordinate with the team lead / product team and translate business requirements into code.
- Write REST APIs and documentation to support consumption of these APIs (a minimal sketch follows this list).
- Communicate technical concepts with trade-offs, risks, and benefits.
- Evaluate and resolve product-related issues.
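As a purely illustrative sketch of the kind of REST API work described above, here is a minimal endpoint using Django REST Framework (assumed installed; the view name, fields, and URL path are hypothetical, not part of the role's actual codebase):
```python
# A minimal sketch of a REST endpoint with Django REST Framework.
from rest_framework.views import APIView
from rest_framework.response import Response
from rest_framework import status


class HealthCheckView(APIView):
    """GET returns service status; POST echoes back the payload it received."""

    def get(self, request):
        return Response({"status": "ok"})

    def post(self, request):
        return Response({"received": request.data}, status=status.HTTP_201_CREATED)


# urls.py (hypothetical wiring)
# from django.urls import path
# urlpatterns = [path("api/health/", HealthCheckView.as_view())]
```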
Requirements :
- Demonstrable experience writing clean, thoughtful, and business-oriented code.
- Strong understanding of JavaScript, HTML, and CSS3. Knowledge of ReactJS and Redux is a plus.
- Good understanding of REST APIs and experience in building them. Knowledge of Django Rest Framework is a plus.
- Experience with asynchronous request handling, partial page updates, and AJAX.
- Proficient understanding of cross browser compatibility issues and ways to work around such issues
- Proficient understanding of code versioning tools, such as Git / Mercurial / SVN
- Proactive in terms of sharing updates across the entire team.
About Us - Celebal Technologies is a premier software services company in the fields of Data Science, Big Data, and Enterprise Cloud. Celebal Technologies helps you discover your competitive advantage by employing intelligent data solutions using cutting-edge technology that can bring massive value to your organization. The core offerings are around "Data to Intelligence", wherein we leverage data to extract intelligence and patterns, thereby facilitating smarter and quicker decision-making for clients. Celebal Technologies understands the core value of modern analytics for the enterprise, and we help businesses improve their business intelligence and become more data-driven in how they architect solutions.
Key Responsibilities
• As a part of the DevOps team, you will be responsible for configuration, optimization, documentation, and support of the CI/CD components.
• Creating and managing build and release pipelines with Azure DevOps and Jenkins.
• Assist in planning and reviewing application architecture and design to promote an efficient deployment process.
• Troubleshoot server performance issues & handle the continuous integration system.
• Automate infrastructure provisioning using ARM Templates and Terraform.
• Monitor and Support deployment, Cloud-based and On-premises Infrastructure.
• Diagnose and develop root cause solutions for failures and performance issues in the production environment.
• Deploy and manage Infrastructure for production applications
• Configure security best practices for application and infrastructure
Essential Requirements
• Good hands-on experience with cloud platforms like Azure, AWS & GCP. (Preferably Azure)
• Strong knowledge of CI/CD principles.
• Strong work experience with CI/CD implementation tools like Azure DevOps, TeamCity, Octopus Deploy, AWS CodeDeploy, and Jenkins.
• Experience writing automation scripts with PowerShell, Bash, Python, etc.
• Experience with GitHub, JIRA, Confluence, and continuous integration (CI) systems.
• Understanding of secure DevOps practices
Good to Have -
• Knowledge of scripting languages such as PowerShell, Bash
• Experience with project management and workflow tools such as Agile, Jira, Scrum/Kanban, etc.
• Experience with Build technologies and cloud services. (Jenkins, TeamCity, Azure DevOps, Bamboo, AWS Code Deploy)
• Strong communication skills and ability to explain protocol and processes with team and management.
• Must be able to handle multiple tasks and adapt to a constantly changing environment.
• Must have a good understanding of SDLC.
• Knowledge of Linux, Windows server, Monitoring tools, and Shell scripting.
• Self-motivated, with a demonstrated ability to deliver on new technologies with minimal supervision.
• Organized and flexible, with the analytical ability to solve problems creatively.
Data Scientist
We are looking for an experienced Data Scientist to join our engineering team and help us enhance our mobile application with data. In this role, we're looking for people who are passionate about developing ML/AI that solves enterprise problems across various domains. We are keen on hiring someone who loves working in a fast-paced start-up environment and is looking to solve some challenging engineering problems.
As one of the earliest members in engineering, you will have the flexibility to design the models and architecture from the ground up. As with any early-stage start-up, we expect you to be comfortable wearing various hats and to be a proactive contributor in building something truly remarkable.
Responsibilities
- Research, develop, and maintain machine learning and statistical models for business requirements
- Work across the spectrum of statistical modelling, including supervised, unsupervised, and deep learning techniques, to apply the right level of solution to the right problem
- Coordinate with different functional teams to monitor outcomes and refine/improve the machine learning models
- Implement models to uncover patterns and predictions, creating business value and innovation
- Identify unexplored data opportunities for the business to unlock and maximize the potential of digital data within the organization
- Develop NLP concepts and algorithms to classify and summarize structured/unstructured text data (a minimal classification sketch follows this list)
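A minimal sketch of the kind of text-classification model implied above, assuming scikit-learn is available; the sample texts and label names are hypothetical stand-ins, not real project data:
```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled support tickets (text, label).
texts = ["refund not processed", "app crashes on login", "great service, thank you"]
labels = ["billing", "bug", "praise"]

# TF-IDF features feeding a linear classifier, wrapped in one pipeline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
model.fit(texts, labels)

print(model.predict(["login screen keeps crashing"]))  # -> likely ["bug"]
```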
Qualifications
- 3+ years of experience solving complex business problems using machine learning.
- Fluency in programming languages such as Python, along with NLP and BERT experience, is a must.
- Strong analytical and critical thinking skills.
- Experience in building production-quality models using state-of-the-art technologies.
- Familiarity with databases like MySQL, Oracle, SQL Server, NoSQL, etc. is desirable.
- Ability to collaborate on projects and work independently when required.
- Previous experience in the Fintech/payments domain is a bonus.
- You should have a Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or another quantitative field from a top-tier institute.
1. Should have worked in Agile methodology and microservices architecture
2. Should have 7+ years of experience in Python and the Django framework
3. Should have a good knowledge of DRF
4. Should have knowledge of user auth (JWT, OAuth2), API auth, access control lists, etc.
5. Should have working experience in session management in Django
6. Should have expertise in the Django MVC pattern and the use of templates in the frontend
7. Should have working experience in PostgreSQL
8. Should have working experience with the RabbitMQ message broker and Celery (a minimal sketch follows this list)
9. Good to have JavaScript implementation knowledge in Django templates
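Purely as an illustration of the RabbitMQ + Celery combination mentioned above, here is a minimal sketch (Celery is assumed installed; the broker URL and task name are hypothetical placeholders):
```python
# tasks.py -- a minimal Celery setup using RabbitMQ as the broker.
from celery import Celery

app = Celery("reports", broker="amqp://guest:guest@localhost:5672//")


@app.task
def generate_report(user_id):
    # Placeholder for real report-generation logic.
    return f"report-for-{user_id}.pdf"


# From Django view code you would enqueue it asynchronously:
# generate_report.delay(42)
```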
About us:
Arista Networks was founded to pioneer and deliver software driven cloud networking solutions for large datacenter storage and computing environments. Arista's award-winning platforms, ranging in Ethernet speeds from 10 to 400 gigabits per second, redefine scalability, agility and resilience. Arista has shipped more than 20 million cloud networking ports worldwide with CloudVision and EOS, an advanced network operating system. Committed to open standards, Arista is a founding member of the 25/50GbE consortium. Arista Networks products are available worldwide directly and through partners.
About the job
Arista Networks is looking for world-class software engineers to join our Extensible Operating System (EOS) software development team. As a core member of the EOS team, you will be part of a fast-paced, high-caliber team building features to run the world's largest data center networks. Your software will be a key component of Arista's EOS, Arista's unique, Linux-based network operating system that runs on all of Arista's data center networking products.
The EOS team is responsible for all aspects of the development and delivery of software meant to run on the various Arista switches. You will work with your fellow engineers and members of the marketing team to gather and understand the functional and technical requirements for upcoming projects. You will help write functional specifications, design specifications, test plans, and the code to bring all of these to life. You will also work with customers to triage and fix problems in their networks. Internally, you will develop automated tests for your software, monitor the execution of those tests, and triage and fix problems found by your tests. At Arista, you will own your projects from definition to deployment, and you will be responsible for the quality of everything you deliver.
This role demands strong and broad software engineering fundamentals, and a good understanding of networking including capabilities like L2, L3, and fundamentals of commercial switching HW. Your role will not be limited to a single aspect of EOS at Arista, but will cover all aspects of EOS.
Responsibilities:
- Write functional specifications and design specifications for features related to forwarding traffic on the internet and cloud data centers.
- Independently implement solutions to small-sized problems in our EOS software, using the C, C++, and Python programming languages.
- Write test plan specifications for small-sized features in EOS, and implement automated test programs to execute the cases described in the test plan (a minimal example follows this list).
- Debug problems found by our automated test programs and fix the problems.
- Work on a team implementing, testing, and debugging solutions to larger routing protocol problems.
- Work with Customer Support Engineers to analyze problems in customer networks and provide fixes for those problems when needed in the form of new software releases or software patches.
- Work with the System Test Engineers to analyze problems found in their tests and provide fixes for those problems.
- Mentor new and junior engineers to bring them up to speed in Arista’s software development environment.
- Review and contribute to the specifications and implementations written by other team members.
- Help to create a schedule for the implementation and debugging tasks, update that schedule weekly, and report it to the project lead.
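Purely as an illustration of an automated test program in Python, here is a toy sketch: a longest-prefix-match routine built only on the standard library's ipaddress module, exercised by pytest-style tests. It is not Arista's EOS tooling; the route table and next-hop names are hypothetical.
```python
# test_lpm.py -- toy longest-prefix-match lookup and pytest-style tests for it.
import ipaddress


def longest_prefix_match(destination, routes):
    """Return the next hop of the most specific route containing `destination`."""
    dest = ipaddress.ip_address(destination)
    best = None
    for prefix, next_hop in routes.items():
        network = ipaddress.ip_network(prefix)
        if dest in network and (best is None or network.prefixlen > best[0]):
            best = (network.prefixlen, next_hop)
    return best[1] if best else None


def test_most_specific_route_wins():
    routes = {"10.0.0.0/8": "nh-a", "10.1.0.0/16": "nh-b"}
    assert longest_prefix_match("10.1.2.3", routes) == "nh-b"


def test_no_matching_route_returns_none():
    assert longest_prefix_match("192.168.1.1", {"10.0.0.0/8": "nh-a"}) is None
```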
Qualifications:
- BS in Computer Science/Electrical Engineering/Computer Engineering plus 3-10 years of experience, MS in Computer Science/Electrical Engineering/Computer Engineering plus 5 years of experience, a Ph.D. in Computer Science/Electrical Engineering/Computer Engineering, or equivalent work experience.
- Knowledge of C, C++, and/or Python.
- Knowledge of UNIX or Linux.
- Understanding of L2/L3 networking including at least one of the following areas is desirable:
- IP routing protocols, such as RIP, OSPF, BGP, IS-IS, or PIM.
- Layer 2 features such as 802.1d bridging, the 802.1d Spanning Tree Protocol, the 802.1ax Link Aggregation Control Protocol, the 802.1AB Link Layer Discovery Protocol, or RFC 1812 IP routing.
- Ability to utilize, test, and debug packet forwarding engine and a hardware component’s vendor provided software libraries in your solutions.
- Infrastructure functions related to distributed systems such as messaging, signalling, databases, and command line interface techniques.
- Hands on experience in the design and development of ethernet bridging or routing related software or distributed systems software is desirable.
- Hands on experience with enterprise or service provider class Ethernet switch/router system software development, or significant PhD level research in the area of network routing and packet forwarding.
- Applied understanding of software engineering principles.
- Strong problem solving and software troubleshooting skills.
- Ability to design a solution to a small-sized problem, and implement that solution without outside help. Able to work on a small team solving a medium-sized problem with limited oversight.
Resources:
- Arista's Approach to Software with Ken Duda (CTO): https://youtu.be/TU8yNh5JCyw
- Additional information and resources can be found at https://www.arista.com/en/
Engineering Director
Experience: 9+ yrs.
Location: Pune (Hybrid)
We are seeking an Engineering Director who operates with vision and integrity and who will use innovative technologies that maximize productivity and help our company grow: someone who plays an active role in development, identifying requirements, setting timelines, and managing projects end to end, and who brings exceptional leadership, communication, and project management skills.
Technology Skills: Java / AWS / Microservices / Serverless architecture / Cloud computing / Spring Boot / Spring Cloud.
Roles and Responsibilities:
- Manage Engineering processes and workflows
- Use development methodologies to prioritize, estimate, and deliver projects with increasing efficiency by leading cross-functional team
- Define the strategic direction and contribute to the technical strategy.
- Provide clear and concise technical guidance to engineering teams
- Monitor reliability and performance of all internal systems to suggest improvements
- Extensive experience with cloud technologies
- Hands-on experience in back-end and front-end development
- Leadership abilities with a strategic mind
- Excellent project management skills
- Analytical skills for evaluating information carefully and solving complex problems
- Bachelor's degree in the engineering field.
About Intraedge: https://intraedge.com/
Intraedge is a technology, products, and learning organization founded in 2002, with offices in the US, India, Europe, Canada, and Singapore. We provide our clients with the resources and expertise to enhance business performance through technology.
About Fluidra: https://www.fluidra.com/
Fluidra, a multinational group listed on the Spanish Stock Exchange, is the global leader in developing innovative products and services in the global residential and commercial market.
About the company
A strong cross-functional team of designers, software developers, and hardware experts who love creating technology products and services. We are not just an outsourcing partner, but with our deep expertise across several business verticals, we bring our best practices so that your product journey is like a breeze.
We love healthcare, medical devices, finance, and consumer electronics but we love almost everything where we can build technology products and services. In the past, we have created several niche and novel concepts and products for our customers, and we believe we still learn every day to widen our horizons!
Introduction - Advanced Technology Group
As an extension to solving the continuous medical education needs of doctors through the courses platform, Iksha Labs also developed several cutting-edge solutions for simulated training and education, including
- Virtual Reality and Augmented Reality based surgical simulations
- Hand and face-tracking-based simulations
- Remote immersive and collaborative training through Virtual Reality
- Machine learning-based auto-detection of clinical conditions from medical images
Job Description
The ideal candidate will be responsible for developing high-quality applications. They will also be responsible for designing and implementing testable and scalable code.
Key Skills/Technology
- Good command of C and C++, with algorithms and data structures
- Image Processing
- Qt (Expertise)
- Python (Expertise)
- Embedded Systems
- Good working knowledge of STL/Boost Algorithms and Data structures
Responsibilities
- Develop quality software and web applications
- Analyze and maintain existing software applications
- Develop scalable, testable code
- Discover and fix programming bugs
Qualifications
Bachelor's degree or equivalent experience in Computer Science/Electronics and Communication or a related field.
Industry Type
Medical / Healthcare
Functional Area
IT Software - Application Programming, Maintenance
Avegen is a digital healthcare company empowering individuals to take control of their health and supporting healthcare professionals in delivering life-changing care. Avegen’s core product, HealthMachine®, is a cloud-hosted, next-generation digital healthcare engine for pioneers in digital healthcare, including healthcare providers and pharmaceutical companies, to deploy high-quality robust digital care solutions efficiently and effectively. We are ISO27001, ISO13485, and Cyber essentials certified; compliant with NHS Data protection toolkit and GDPR.
Job Summary:
We are looking for a Mobile Automation Tester who is passionate about mobile app automation and has worked with one or more mobile automation frameworks.
Roles and Responsibilities :
- Write, design, and execute automated tests by creating scripts that run testing functions automatically.
- Build test automation frameworks.
- Work in an agile development environment where developers and testers work closely together to ensure requirements are met.
- Design, document, manage and execute test cases, sets, and suites.
- Work in cross-functional project teams that include Development, Marketing, Usability, Software Quality Assurance, Customer Learning, and Support.
- Review test cases and automate whenever possible.
- Educate team members on test automation and drive adoption.
- Integrate automated test cases into nightly build systems.
Required Skills:
- Previous experience working as a QA automation engineer.
- Experience in mobile testing: iOS automation and Android automation.
- Hands-on experience in any programming language like Java, Python, JavaScript, Ruby, or C#.
- Experience and knowledge of tools like JIRA, Selenium, and Postman, plus web and app test automation (a minimal sketch follows this list).
- Ability to deliver results under pressure.
- Self-development skills to keep up to date with fast-changing trends.
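Purely as an illustration of mobile app automation in Python, here is a minimal sketch using the Appium Python client (assumed installed, version 2+, pointed at a local Appium server); the server URL, capabilities, APK path, and accessibility IDs are all hypothetical placeholders:
```python
# A minimal Android automation sketch with the Appium Python client (assumed v2+).
from appium import webdriver
from appium.options.common import AppiumOptions
from appium.webdriver.common.appiumby import AppiumBy

# Hypothetical capabilities; adjust for your device and app under test.
options = AppiumOptions().load_capabilities({
    "platformName": "Android",
    "appium:automationName": "UiAutomator2",
    "appium:deviceName": "emulator-5554",
    "appium:app": "/path/to/app-under-test.apk",
})

driver = webdriver.Remote("http://localhost:4723", options=options)
try:
    driver.find_element(AppiumBy.ACCESSIBILITY_ID, "login_button").click()
    assert driver.find_element(AppiumBy.ACCESSIBILITY_ID, "home_screen").is_displayed()
finally:
    driver.quit()
```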
Good to Have Skills:
- Experience working with CI/CD pipelines like (Jenkins, Circle CI).
- API, DB Automation.
- Excellent scripting experience.
Educational Qualifications:
● Candidates with Bachelor / Master's degree would be preferred
Job Responsibilities:
Support, maintain, and enhance existing and new product functionality for trading software in a real-time, multi-threaded, multi-tier server architecture environment, and create high- and low-level designs for concurrent, high-throughput, low-latency software architecture.
- Provide software development plans that meet future needs of clients and markets
- Evolve the new software platform and architecture by introducing new components and integrating them with existing ones
- Perform memory, CPU, and resource management
- Analyze stack traces, memory profiles and production incident reports from traders and support teams
- Propose fixes and enhancements to existing trading systems
- Adhere to release and sprint planning with the Quality Assurance Group and Project Management
- Work on a team building new solutions based on requirements and features
- Attend and participate in daily scrum meetings
Required Skills:
- JavaScript and Python
- Multi-threaded browser and server applications
- Amazon Web Services (AWS)
- REST
Job Role - HTML/CSS Developer with Jinja Support
Location - Pune
Experience - 1+ Years
Job Description:
We are seeking a highly skilled HTML/CSS developer to join our team at a SaaS startup. The ideal candidate will have experience with HTML, CSS, and Jinja, as well as a strong understanding of web development best practices.
Responsibilities:
- Develop and maintain web pages and web applications using HTML, CSS, and Jinja (see the sketch after this list)
- Collaborate with the development team to design and implement new features
- Write clean, maintainable, and efficient code
- Stay up-to-date with the latest web development trends and technologies
- Troubleshoot and debug any issues that arise
- Test and optimize web pages for maximum speed and scalability
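A minimal, illustrative Flask + Jinja sketch of the kind of templated page work described above (Flask is assumed installed; the route, template markup, and styling are hypothetical):
```python
# app.py -- renders an inline Jinja template with a little CSS.
from flask import Flask, render_template_string

app = Flask(__name__)

PAGE = """
<!doctype html>
<html>
  <head><style>.greeting { font-family: sans-serif; color: #2c3e50; }</style></head>
  <body>
    <h1 class="greeting">Hello, {{ name }}!</h1>
    <ul>
      {% for feature in features %}<li>{{ feature }}</li>{% endfor %}
    </ul>
  </body>
</html>
"""


@app.route("/")
def index():
    return render_template_string(PAGE, name="there", features=["fast", "clean", "maintainable"])


if __name__ == "__main__":
    app.run(debug=True)
```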
Qualifications:
- Strong experience with HTML, CSS, and Jinja
- Experience with web development frameworks such as Flask or Django
- Strong understanding of web development best practices
- Familiarity with version control systems such as Git
- Strong problem-solving and debugging skills
- Excellent communication and teamwork abilities
This is a full-time position with a competitive salary and benefits, and the opportunity to work with a talented and passionate team in a rapidly growing startup. If you are passionate about web development and want to make a real impact in a dynamic and innovative company, we would love to hear from you.
About CrelioHealth:
CrelioHealth (formerly LiveHealth) is an IT product company in the Health care domain. We are an almost decade-old IT product organisation.
We are a flourishing, Open & Flexi culture organisation with a young team.
We are a group of young enthusiasts passionate about building the best line of products in healthcare diagnostics. Our product is LIMS & CRM used for Pathology Labs & Hospitals.
Our Product -
- CrelioHealth LIMS - Web-based LIMS (Laboratory Information Management System) and RIS (Radiology Information System) solution for automating your processes & managing the business better
- CrelioHealth CRM- Patient booking and engagement tool to take patient experience to the next level.
- CrelioHealth Inventory - Online platform to manage your lab inventory, stock, and purchases
Org link - https://creliohealth.in/
We were voted #14 in G2's List of Best Software Sellers for 2021. CrelioHealth (formerly LiveHealth) is a cloud-based LIS and RIS solution that enables laboratory staff, doctors, and patients to easily access and manage medical information on the same platform.
Find out more at https://creliohealth.com/ or get updates on our blog, CrelioHealth for Diagnostics: https://blog.creliohealth.in
We are looking for a Quantitative Developer who is passionate about financial markets and wants to join a scale-up with an excellent track record and growth potential in an innovative and fast-growing industry.
As a Quantitative Developer, you will be working on the infrastructure of our platform, as part of a very ambitious team.
At QCAlpha you have the freedom to choose the path that leads to the solution, and you get a lot of responsibility.
Responsibilities
• Design, develop, test, and deploy elegant software solutions for automated trading systems
• Building high-performance, bullet-proof components for both live trading and simulation
• Responsible for technology infrastructure systems development, which includes connectivity, maintenance, and internal automation processes
• Achieving trading system robustness through automated reconciliation and system-wide alerts (a small reconciliation sketch follows this list)
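Purely as an illustration of the automated-reconciliation idea above, here is a minimal pandas sketch; the column names and sample data are hypothetical, not the firm's actual schema:
```python
# Reconcile internal order records against broker fills and flag quantity mismatches.
import pandas as pd

orders = pd.DataFrame({"order_id": [1, 2, 3], "symbol": ["AAPL", "MSFT", "INFY"], "qty": [100, 50, 200]})
fills = pd.DataFrame({"order_id": [1, 2, 3], "filled_qty": [100, 40, 200]})

recon = orders.merge(fills, on="order_id", how="left")
recon["filled_qty"] = recon["filled_qty"].fillna(0)
breaks = recon[recon["qty"] != recon["filled_qty"]]

if not breaks.empty:
    # In a real system this would raise a system-wide alert instead of printing.
    print("Reconciliation breaks found:\n", breaks)
```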
Requirements
• Bachelor’s degree or higher in computer science or other quantitative discipline
• Strong fundamental knowledge of OOP programming, algorithms, data structures and design patterns.
• Familiar with the following technology stacks: Linux shell, Python and its ecosystem, NumPy, Pandas, SQL, Redis, Docker or similar system
• Experience in python frameworks such as Django or Flask.
• Solid understanding of Git and CI/CD.
• Excellent design, debugging and problem-solving skills.
• Proven versatility and ability to pick up new technologies and learn systems quickly.
• Trading Execution development and support experience is a plus.
We are looking for curious & inquisitive technology practitioners. Our customers see us as one of the most premium advisory and development services firms, hence most of the problems we work on are complex and often hard to solve. You can expect to work in small (2-5 person) teams, working very closely with the customers in iteratively developing and evolving the solution. We are continually on the search for passionate, bright and energetic professionals to join our team.
So, if you are someone who has strong fundamentals in technology and wants to stretch beyond regular role-based boundaries, then Sahaj is the place for you. You will experience a world where there are no roles or grades; you will play different roles and wear multiple hats to deliver a software project.
Responsibilities
- Work on complex, custom-designed, scalable, multi-tiered software development projects
- Work closely with clients (commercial & social enterprises, start ups), both Business and Technical staff members *
- Be responsible for the quality of software and resolving any issues regards the solution
- Think through hard problems, not limited to technology and work with a team to realise and implement solutions
- Learn something new everyday
Requirements
- Development and delivery experience in any of the programming languages
- Passion for software engineering and craftsman-like coding prowess
- Great design and solutioning skills (OO & Functional)
- Experience including analysis, design, coding and implementation of large scale custom built object-oriented applications
- Understanding of code refactoring and optimisation issues
- Understanding of Virtualisation & DevOps.
- Experience with Ansible, Chef, Docker preferable *
- Ability to learn new technologies and adapt to different situations
- Ability to handle ambiguity on a day to day basis
- Skills: J2EE, Spring Boot, Hibernate (Java), Java and Scala
Desired Skills and Experience
- .NET, Golang, Java, Node.js, Python, Ruby, Scala, Hibernate, J2EE, Ruby on Rails, Spring
Enterprise Minds, with a core focus on engineering products, automation and intelligence, partners with customers on the trajectory towards increasing outcomes, relevance, and growth.
Harnessing the power of data and the forces that define AI, Machine Learning and Data Science, we believe in institutionalizing go-to-market models and not just exploring possibilities.
We believe in a customer-centric ethic without and a people-centric paradigm within. With a strong sense of community, ownership, and collaboration our people work in a spirit of co-creation, co-innovation and co-development to engineer next-generation software products with the help of accelerators.
Through Communities we connect and attract talent that shares skills and expertise. Through Innovation Labs and global design studios we deliver creative solutions.
We create vertical, isolated pods which have a narrow but deep focus. We also create horizontal pods to collaborate and deliver sustainable outcomes.
We follow Agile methodologies to fail fast and deliver scalable and modular solutions. We constantly self-assess and realign to work with each customer in the most impactful manner.
Pre-requisites for the Role
- Job ID: EMSP0120PS
- Primary skill: Splunk Development and Administration
- Secondary skills: Python, Splunk DB Connect, Visual Studio (C#), Bitbucket, Kafka, DevOps tools
- Years of experience: 5-8 years
- Location: Pune (hybrid model)
- Positions: 1
- Budget: 5-6 years (max up to 17 LPA); 6-8 years (max up to 22 LPA)
- Notice period: Immediate
Primary Role & Responsibility:
As a software engineer, your daily work involves technically challenging applications and projects where your code makes a direct contribution to the further development and upkeep of our software suite and to its application in projects.
You should be able to create Splunk dashboards and apps, and you should have a good understanding of source interfaces for Splunk.
You should have an idea of onboarding data from different sources such as JSON, XML, syslog, and error-log files.
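Purely as an illustration of programmatic data onboarding, here is a minimal sketch that sends one JSON event to Splunk's HTTP Event Collector using the requests library; the host, token, index, and sourcetype are hypothetical placeholders:
```python
# Send one JSON event to a Splunk HTTP Event Collector (HEC) endpoint.
import requests

HEC_URL = "https://splunk.example.com:8088/services/collector/event"   # placeholder host
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                     # placeholder token

event = {
    "event": {"level": "ERROR", "message": "payment gateway timeout"},
    "sourcetype": "_json",
    "index": "app_logs",
}

resp = requests.post(
    HEC_URL,
    json=event,
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    timeout=10,
    verify=False,  # only acceptable for a lab setup with a self-signed certificate
)
resp.raise_for_status()
print(resp.json())
```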
As a software engineer, we expect much more from you than just the ability to design and develop good software. We find it important that you possess an inherent drive to get the best out of yourself every day, that you are inquisitive and that you are not intimidated by situations which require you to branch off from the beaten track. You work together with colleagues in a SCRUM team. In addition, you have regular contact with other software teams, software architects, testers and end users. Good communication skills are therefore extremely important, as well as the ability to think pro-actively and suggest possible improvements. This gives you every opportunity to contribute your personal input and grow and develop within the department.
The often complex functionality of the software includes business logic, controls for logistical transport, communication with external computer systems, reporting, data analysis and simulation. This functionality is spread across various components. You design, program and test the software based on a design concept and a set of requirements. In some cases, you will have to personally formulate these requirements together with the (end) users and / or internal stakeholders. Learn more about the Software modular stack
Desired Profile & Experience: Knowledge of Kafka and experience with Java
- Splunk Architecture, on-premise and cloud based deployment.
- IoT edge.
- Analytical skills and capabilities to understand how raw (unstructured) data needs to be transformed into processed information.
Graas uses predictive AI to turbo-charge growth for eCommerce businesses. We are “Growth-as-a-Service”. Graas integrates traditional data silos and applies a machine-learning AI engine, acting as an in-house data scientist to predict trends and give real-time insights and actionable recommendations for brands. The platform can also turn insights into action by seamlessly executing these recommendations across marketplace store fronts, brand.coms, social and conversational commerce, performance marketing, inventory management, warehousing, and last-mile logistics - all of which impacts a brand’s bottom line, driving profitable growth.
Roles & Responsibilities:
- Work on the implementation of real-time and batch data pipelines for disparate data sources.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies.
- Build and maintain an analytics layer that utilizes the underlying data to generate dashboards and provide actionable insights.
- Identify improvement areas in the current data system and implement optimizations.
- Work on specific areas of data governance including metadata management and data quality management.
- Participate in discussions with Product Management and Business stakeholders to understand functional requirements and interact with other cross-functional teams as needed to develop, test, and release features.
- Develop Proof-of-Concepts to validate new technology solutions or advancements.
- Work in an Agile Scrum team and help with planning, scoping and creation of technical solutions for the new product capabilities, through to continuous delivery to production.
- Work on building intelligent systems using various AI/ML algorithms.
Desired Experience/Skill:
- Must have worked on Analytics Applications involving Data Lakes, Data Warehouses and Reporting Implementations.
- Experience with private and public cloud architectures with pros/cons.
- Ability to write robust code in Python and SQL for data processing. Experience in libraries such as Pandas is a must; knowledge of one of the frameworks such as Django or Flask is a plus.
- Experience in implementing data processing pipelines using AWS services: Kinesis, Lambda, Redshift/Snowflake, RDS (a small Lambda sketch follows this list).
- Knowledge of Kafka, Redis is preferred
- Experience on design and implementation of real-time and batch pipelines. Knowledge of Airflow is preferred.
- Familiarity with machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn)
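As a purely illustrative sketch of the Kinesis + Lambda pattern mentioned above (the record fields and the transformation are hypothetical, and a real handler would write to Redshift/Snowflake or S3 rather than print):
```python
# A minimal AWS Lambda handler for a Kinesis-triggered function.
# Kinesis delivers record payloads base64-encoded; decode, parse JSON, transform.
import base64
import json


def handler(event, context):
    processed = 0
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        order = json.loads(payload)

        # Hypothetical transformation: normalise the currency amount to cents.
        order["amount_cents"] = int(round(float(order.get("amount", 0)) * 100))

        print(json.dumps(order))  # placeholder for a Redshift/Snowflake/S3 write
        processed += 1

    return {"processed": processed}
```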
• Work with various stakeholders, understand requirements, and build solutions/data pipelines that address the needs at scale
• Bring key workloads to the clients’ Snowflake environment using scalable, reusable data ingestion and processing frameworks to transform a variety of datasets
• Apply best practices for Snowflake architecture, ELT and data models
Skills - 50% of below:
• A passion for all things data; understanding how to work with it at scale and, more importantly, knowing how to get the most out of it
• Good understanding of native Snowflake capabilities like data ingestion, data sharing, zero-copy cloning, tasks, Snowpipe etc.
• Expertise in data modeling, with a good understanding of modeling approaches like Star schema and/or Data Vault
• Experience in automating deployments
• Experience writing code in Python, Scala, Java or PHP (a small Python connector sketch follows this list)
• Experience in ETL/ELT either via a code-first approach or using low-code tools like AWS Glue, Appflow, Informatica, Talend, Matillion, Fivetran etc.
• Experience with one or more AWS services, especially in relation to integration with Snowflake
• Familiarity with data visualization tools like Tableau, Power BI, Domo or any similar tool
• Experience with Data Virtualization tools like Trino, Starburst, Denodo, Data Virtuality, Dremio etc.
• Certified SnowPro Advanced: Data Engineer is a must.
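A minimal, hedged sketch of loading and querying data from Python with the Snowflake connector (assuming snowflake-connector-python is installed; the account, credentials, database objects, and file path are placeholders):
```python
# Connect to Snowflake, stage a local Parquet file, and run a simple query.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # placeholder account locator
    user="ETL_USER",
    password="********",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # PUT uploads the local file to the table's internal stage; COPY INTO loads it.
    cur.execute("PUT file:///tmp/orders.parquet @%ORDERS AUTO_COMPRESS=TRUE")
    cur.execute(
        "COPY INTO ORDERS FILE_FORMAT = (TYPE = PARQUET) MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE"
    )
    cur.execute("SELECT COUNT(*) FROM ORDERS")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```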
Job Summary
As a Data Science Lead, you will manage multiple consulting projects of varying complexity and ensure on-time and on-budget delivery for clients. You will lead a team of data scientists and collaborate across cross-functional groups, while contributing to new business development, supporting strategic business decisions, and maintaining and strengthening the client base.
- Work with team to define business requirements, come up with analytical solution and deliver the solution with specific focus on Big Picture to drive robustness of the solution
- Work with teams of smart collaborators. Be responsible for their appraisals and career development.
- Participate and lead executive presentations with client leadership stakeholders.
- Be part of an inclusive and open environment. A culture where making mistakes and learning from them is part of life
- See how your work contributes to building an organization and be able to drive Org level initiatives that will challenge and grow your capabilities.
Role & Responsibilities
- Serve as expert in Data Science, build framework to develop Production level DS/AI models.
- Apply AI research and ML models to accelerate business innovation and solve impactful business problems for our clients.
- Lead multiple teams across clients ensuring quality and timely outcomes on all projects.
- Lead and manage the onsite-offshore relation, at the same time adding value to the client.
- Partner with business and technical stakeholders to translate challenging business problems into state-of-the-art data science solutions.
- Build a winning team focused on client success. Help team members build lasting career in data science and create a constant learning/development environment.
- Present results, insights, and recommendations to senior management with an emphasis on the business impact.
- Build engaging rapport with client leadership through relevant conversations and genuine business recommendations that impact the growth and profitability of the organization.
- Lead or contribute to org level initiatives to build the Tredence of tomorrow.
Qualification & Experience
- Bachelor's /Master's /PhD degree in a quantitative field (CS, Machine learning, Mathematics, Statistics, Data Science) or equivalent experience.
- 6-10+ years of experience in data science, building hands-on ML models
- Expertise in ML – Regression, Classification, Clustering, Time Series Modeling, Graph Network, Recommender System, Bayesian modeling, Deep learning, Computer Vision, NLP/NLU, Reinforcement learning, Federated Learning, Meta Learning.
- Proficient in some or all of the following techniques: Linear & Logistic Regression, Decision Trees, Random Forests, K-Nearest Neighbors, Support Vector Machines, ANOVA, Principal Component Analysis, Gradient Boosted Trees, ANN, CNN, RNN, Transformers (a minimal scikit-learn sketch follows this list).
- Knowledge of programming languages SQL, Python/ R, Spark.
- Expertise in ML frameworks and libraries (TensorFlow, Keras, PyTorch).
- Experience with cloud computing services (AWS, GCP or Azure)
- Expert in statistical modelling & algorithms, e.g. hypothesis testing, sample size estimation, A/B testing
- Knowledge of mathematical programming (Linear Programming, Mixed Integer Programming, etc.) and stochastic modelling (Markov chains, Monte Carlo, stochastic simulation, queuing models).
- Experience with optimization solvers (Gurobi, CPLEX) and algebraic modelling languages (PuLP)
- Knowledge in GPU code optimization, Spark MLlib Optimization.
- Familiarity to deploy and monitor ML models in production, delivering data products to end-users.
- Experience with ML CI/CD pipelines.
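A minimal, illustrative scikit-learn sketch of the modelling workflow referenced above; it uses scikit-learn's bundled breast-cancer dataset purely as a stand-in for real client data:
```python
# Train and evaluate a logistic-regression classifier on a bundled toy dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

probs = model.predict_proba(X_test)[:, 1]
print(f"Hold-out ROC AUC: {roc_auc_score(y_test, probs):.3f}")
```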
Front End developers
- AngularJS experience
- MongoDB query and aggregation experience (not a database administrator)
- GraphQL experience
- Node.js and TypeScript experience
- CSS and SCSS experience
- CI/CD experience with GitHub Actions
Backend Developers:
- Software development experience in one of Python (preferred) or Node.js/TypeScript
- Experience with messaging architectures - RabbitMQ (preferred) or Kafka (a minimal sketch follows this list)
- Experience with Docker containers
- Experience with Apache NiFi (valued but not necessary)
- Experience with designing or implementing horizontally scalable solutions
- Experience working with RESTful APIs
- CI/CD experience with GitHub Actions
- Experience with Azure cloud
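Purely as an illustration of the RabbitMQ messaging pattern mentioned above, a minimal producer/consumer sketch with the pika client (assumed installed; the queue name, payload, and broker host are placeholders):
```python
# Publish one message to a RabbitMQ queue and then consume it, using pika.
import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="events", durable=True)

# Producer: publish a JSON payload.
channel.basic_publish(
    exchange="",
    routing_key="events",
    body=json.dumps({"type": "signup", "user_id": 42}),
    properties=pika.BasicProperties(delivery_mode=2),  # persist the message
)


# Consumer: handle messages until interrupted.
def on_message(ch, method, properties, body):
    print("received:", json.loads(body))
    ch.basic_ack(delivery_tag=method.delivery_tag)


channel.basic_consume(queue="events", on_message_callback=on_message)
channel.start_consuming()
```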
Technobrilliant learning solutions is hiring IT Trainers for their Training Institute.
Trainers for the profiles of :
1) Software testing
2) Full stack Java
3) Full stack python
4) Salesforce
Qualification: Graduate
Location: Shivaji Nagar, Pune & KK Market, Katraj, Pune
Mode: Offline
Type: Full-time / Part-time
Salary: 20-40K
Vacancy for Teaching profile at IT training institute.
Technobrilliant Learning solutions is hiring an "IT faculty member" for IT Training & Skill Development center.
Position available:
1) FULLSTACK JAVA/PYTHON FACULTY
Mode of job: Offline (onsite)
Location: Shivaji Nagar, Pune & KK Market (Katraj), Pune
Experience: 2+ years for software testing & full-stack; 5+ years for Salesforce.
Type: Full-time & Part-time
Salary: Depends on experience & interview.
Website: www.technobrilliant.com
- 3-8+ years of experience programming in a backend language (Java / Python), with a good understanding of troubleshooting errors.
- 5+ years of experience in Confluent Kafka / 3+ years of experience in Confluent Kafka
- Cloud Kafka, Control Center, REST Proxy, HAProxy, Confluent Kafka Connect, Confluent Kafka security features (a minimal producer sketch follows this list)
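As a purely illustrative sketch of producing to Kafka from Python with the confluent-kafka client (assumed installed; the broker address and topic name are placeholders):
```python
# Produce a few messages to a Kafka topic with the confluent-kafka client.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder broker


def on_delivery(err, msg):
    # Called once per message when the broker acknowledges (or rejects) it.
    if err is not None:
        print("delivery failed:", err)
    else:
        print(f"delivered to {msg.topic()}[{msg.partition()}] @ offset {msg.offset()}")


for i in range(3):
    payload = json.dumps({"event_id": i, "source": "demo"})
    producer.produce("demo-topic", value=payload.encode("utf-8"), callback=on_delivery)

producer.flush()  # block until all outstanding messages are delivered
```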
at Altimetrik
Location: Chennai, Pune, Bangalore, Jaipur | Experience: 5 to 8 years
- Implement best practices for the engineering team across code hygiene, overall architecture design, testing, and deployment activities
- Drive technical decisions for building data pipelines, data lakes, and analyst access.
- Act as a leader within the engineering team, providing support and mentorship for teammates across functions
- Bachelor’s Degree in Computer Science or equivalent job experience
- Experienced developer in large data environments
- Experience using Git productively in a team environment
- Experience with Docker
- Experience with Amazon Web Services
- Ability to sit with business or technical SMEs to listen, learn and propose technical solutions to business problems
- Experience using and adapting to new technologies
- Take and understand business requirements and goals
- Work collaboratively with project managers and stakeholders to make sure that all aspects of the project are delivered as planned
- Strong SQL skills with MySQL or PostgreSQL
- Experience with non-relational databases and their role in web architectures desired
Knowledge and Experience:
- Good experience with Elixir and functional programming is a plus
- Several years of Python experience
- Excellent analytical and problem-solving skills
- Excellent organizational skills
- Proven verbal and written cross-department and customer communication skills
Want to work with an established & growing IT company? Join team Benison to have the right challenges that will help you accelerate your career growth to the next level, faster!
Benison Technologies was started in 2011 with a mission to revolutionize the silicon industry in India. With a host of amazing big clients like Google, Cisco, McAfee, and Intel, you get to experience the best of both worlds. If you consider yourself an engineer who is capable of joining our ever-growing team, then this is the right opportunity for you:
Why Benison Tech?
We have a partial acquisition from one of the biggest names in the world (well we can’t name them thanks to confidentiality) it’s one of the FAANG companies, and you can “Google” it if you like.
Oh! & one more thing, this did not happen by accident, our team put a ton of efforts to turn this gigantic dream into a reality.
Benison Tech has a consistent history of demonstrating growth through innovation time and again.
We don’t stop there, we then re-invest our profits back into the initiatives for the growth of our people, our culture and the company. Now enough with us, let’s talk about the job roles & responsibilities:
What you will be working on:
- Key contributor for developing product strategies and features.
- Software development for the industry's leading SaaS platform
- You will be involved closely in the planning, design, and integration of client requirements.
- You will be working with one of the leaders in data resiliency and data protection.
Here are some of the technical skills required:
- Independently own features and create feature test plans/strategies based on development and feature completion milestones.
- Identify quality assurance process bottlenecks and suggest actions for improvement.
- Design automation framework for automating feature tests.
- Participate in test case, test plan, and code reviews.
- Resolve functional queries coming from other business units such as support, escalation, product management, etc.
- Participate in bug triaging and tracking quality assurance metrics.
- Hands-on experience with Python-Selenium or Cypress will be preferred (a minimal sketch follows this list).
- Familiarity with Test Management systems like XRay and bug tracker like JIRA tools.
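A minimal, illustrative Python-Selenium test (assuming selenium is installed and a local Chrome/chromedriver setup is available; the target URL and assertions are placeholders, not the product's real test suite):
```python
# test_homepage.py -- a tiny Selenium check that a page loads and renders.
from selenium import webdriver
from selenium.webdriver.common.by import By


def test_homepage_renders():
    driver = webdriver.Chrome()  # assumes a compatible chromedriver is available
    try:
        driver.get("https://example.com")  # placeholder URL
        assert "Example" in driver.title
        # Placeholder locator; a real suite would target stable IDs or data-test attributes.
        body = driver.find_element(By.TAG_NAME, "body")
        assert body.is_displayed()
    finally:
        driver.quit()
```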
What we expect from you:
- 3-10 Years of relevant experience in QA Automation.
- Expert at test automation, creating test plans, test strategies for testing multiple product modules
- Should be able to quickly analyze failures and trace back to issues in the product or the automation suite.
- As a Software Development Engineer in Test you should be an expert at test automation for APIs as well as UI, creating test plans and test strategies for testing product features.
- You will guide and mentor junior team members by reviewing their automation code and test cases to ensure good coverage and quality of a feature
- Resolve functional queries coming from other business units such as support, escalation, product management, etc.
- Be a quick learner and be open to working on new technologies if needed.
- Excellent team player with strong verbal & written communication skills.
- Be able to step up when the situation demands such as meeting deadlines and critical production issues.
- Propose changes or enhancements to the framework for enabling new feature tests.
Few Skills which will add brownie points to your role
- Working knowledge of Docker and Kubernetes will be an advantage
- Awareness of general manual and automation concepts and all types of testing methods
- Knowledge of the Backup or Storage domain will be an advantage.
If the above fits your skill-sets and tickles your interest then read below about the additional benefits that our company offers to talented folks like you:
Work Culture and Benefits
- Competitive salary and benefits package (H1-B, which means a chance to work onsite outside of India)
- A culture focused on talent development where you get promoted within the quarterly cycle of your anniversary.
- Opportunity to work with cutting-edge & challenging technologies including legacy tech.
- Open cafeteria to grab some munchies while you work, we make sure the space feels like your second home, you can also wear pyjamas if you like.
- Employee engagement initiatives such as project parties, flexible work hours, and long service awards, team bonding activities within the company, extra learning and personal development trainings, because why stop your learning at one thing!
- Insurance coverage: Group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and your parents. (With some of the best insurance partners in India)
- Enjoy collaborative innovation (each member gets to innovate & think out of the box), along with highly experienced team managers who maintain diversity and work-life well-being.
- And of course, you get to work on projects from some of the most recognised brands within the networking and security space of the world, unlocking global opportunities to learn, grow & contribute in a way that is truly impactful yet purposeful at the same time.
Still not satisfied, and want more proof?
Head to our website https://benisontech.com to learn more.
Job Role – Principal Software Engineer (Back End)
About Peppermint: Peppermint is an award-winning robotics company, supported by SINE IIT-Bombay and Qualcomm. Peppermint develops and deploys industrial and enterprise robots for mobility-led services. Robots built on the Peppermint Platform are deployed across 4 countries.
What to expect :
At Peppermint Robots, we rely on our dynamic team of engineers to solve the many challenges and puzzles that come with our rapidly evolving technical stack.
We’re seeking an experienced Principal Software Engineer-Back End to lead our Software development and architecture as part of the Engineering team.
Here, you will take complete, end-to-end ownership of back-end projects across the entire software stack, and lead the Software Team to deliver impactful projects.
Our ideal candidate has experience building products across the stack and a firm understanding of Web Development, APIs, Python (Flask or Django knowledge is a bonus), databases (NoSQL knowledge is a bonus). Frontend Development or Mobile Development experience is a valued bonus. You’ll be joining and leading a team working at the forefront of new technology, solving the challenges that impact both the front-end and back-end architecture, and ultimately, delivering amazing global user experiences.
What we offer:
• Freedom to prototype with ideas and experimental technologies and incorporate into the production stack
• Rapidly accelerate your career progression
• Lead architecture and design of products with global reach (self-starter attitude and ability to take decisions is a highly valued trait at Peppermint)
• Do novel innovative work, build features from scratch, dive deep into nuances of individual technical domain interest
• Lead and build a high-performance result-oriented team
• Be proud of the work done and have your contributions recognized and valued across the organization
Role Responsibilities :
• Lead the Software Team. Ensure consistent deliveries of planned features while ensuring code quality, testing standards, and Peppermint Processes are maintained.
• Work with the leadership team to cultivate and grow the Internal Software Team Culture at Peppermint
• Contribute to the Stack Backend as one of the primary backend developers of your team
• Exert influence on the overall objectives and long-range goals of your team.
• Help to define and improve our internal standards for style, maintainability, and best practices for a high-scale web environment as well as all other Software Products.
Daily and Monthly Responsibilities
• Participate in all aspects of Agile software development including design, development, and deployment
• Lead the Software Team ensuring Agile Methodologies are followed, Peppermint Culture is respected, and manage day-to-day activities of the software team to ensure deliverables are accomplished on time and meeting requirements
• Architect and provide guidance on building end-to-end systems optimized for speed and scale
• Work closely with the Robotics and Electronics Teams to deliver powerful software tools and platforms
• Monthly Review, KPI Setting and MIS to Engineering leadership members.
We are looking for :
• 4-9 years of experience building large-scale software applications and working with large Software Teams
• Bachelor’s degree in computer science, information technology, or engineering
• Experience designing and integrating RESTful APIs (a minimal sketch follows this list)
• Knowledge of Python and Backend Development
• Experience building Web/Mobile applications
• Excellent debugging and optimization skills
• Unit and Integration testing experience
• Being knowledgeable about engineering processes and good practices
• Passionate about learning new tools. Ability to continuously learn and acquire knowledge.
• Able to adapt to changing complexity of tasks.
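A minimal, illustrative sketch of a RESTful endpoint plus a unit test using Flask's built-in test client (Flask is assumed installed; the route, payload shape, and in-memory store are hypothetical):
```python
# api.py -- a tiny REST endpoint and a test exercised through Flask's test client.
from flask import Flask, jsonify, request

app = Flask(__name__)

ROBOTS = {}  # in-memory store standing in for a real database


@app.post("/robots")
def register_robot():
    data = request.get_json(force=True)
    robot_id = len(ROBOTS) + 1
    ROBOTS[robot_id] = {"id": robot_id, "name": data["name"]}
    return jsonify(ROBOTS[robot_id]), 201


def test_register_robot():
    client = app.test_client()
    resp = client.post("/robots", json={"name": "sweeper-01"})
    assert resp.status_code == 201
    assert resp.get_json()["name"] == "sweeper-01"
```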
Work Culture : We are a process and speed-thinking led team, with domain experts working together to build world class robots. We care for intent, attitude and collaboration over just results and proof of work. Our culture stands for authentic stance, loud-and-clear communication, no hesitation and emphasis to “ask” anything! We care deeply for every team member’s career journey and the culture which propels it forwards. We do expect you to appreciate the underlying purpose at Peppermint and get going!
Job Type: Full-time
- Data Engineer
Required skill set: AWS Glue, AWS Lambda, AWS SNS/SQS, AWS Athena, Spark, Snowflake, Python
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting languages: Python & PySpark (a minimal sketch follows this list)
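A minimal, illustrative PySpark sketch of the kind of ingestion/transform step implied above (the S3 paths and column names are placeholders; inside an actual Glue job you would typically obtain the session via GlueContext rather than building it directly):
```python
# Read raw JSON from S3, apply a simple transformation, and write partitioned Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

raw = spark.read.json("s3://my-raw-bucket/orders/2024/")  # placeholder path

cleaned = (
    raw.filter(F.col("status").isNotNull())
       .withColumn("order_date", F.to_date("created_at"))
       .withColumn("amount", F.col("amount").cast("double"))
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://my-lake-bucket/curated/orders/"))  # placeholder path
```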
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Data ingestion from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems. Implement data ingestion and processing with the help of Big Data technologies.
- Data processing/transformation using various technologies such as Spark and cloud services. You will need to understand your part of the business logic and implement it using the language supported by the base data platform.
- Develop automated data quality check to make sure right data enters the platform and verifying the results of the calculations
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports and presenting findings
- Mentor junior members and bring best industry practices
QUALIFICATIONS
- 5-7+ years’ experience as data engineer in consumer finance or equivalent industry (consumer loans, collections, servicing, optional product, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of one of the following languages: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with
- Data mining/programming tools (e.g. SAS, SQL, R, Python)
- Database technologies (e.g. PostgreSQL, Redshift, Snowflake. and Greenplum)
- Data visualization (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience in the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence other related tools