Job Description:
Responsibilities:
- Completing all tasks set by the supervisor and assisting wherever possible.
- Observing existing strategies and techniques for coding, debugging, and testing, and adapting to them
- Ability to maintain composure under pressure
- Ability to work in a team.
- Good observation skills and a willingness to learn.
Skills:
- Proficiency in data structures and algorithms
- Good problem solving and analytical thinking skills
- Knowledge of Linux systems
- Python coding knowledge
- Knowledge of object-oriented programming
- Good verbal and written communication skills.
Requisition Raised by:
Engineering Director
About Monarch Tractors India
The Monarch Mission
"It is difficult to go green, when you are in the red."™
Monarch Tractor is a mission-driven company that is committed to elevating farming practices to enable clean, efficient, and economically viable solutions for today's farmers and the generations of farmers to come.
We are collaborative partners with a farmer-first approach to innovation.
✔️100% Electric
✔️Driver Optional
✔️Data-Driven
Similar jobs
1. Develop backends for applications in e-commerce/insurance/wealth-management businesses
2. Design technically sound systems and deliver results quickly
3. Build highly performant applications that set top standards in their respective industries
Basic qualifications:
1. 2-4 years of experience building highly performant applications in Python
2. Expertise in Python frameworks like Django
3. Familiarity with REST APIs
4. Good grasp of data structures and proficiency in problem-solving
5. Knowledge of design patterns
If interested, please send me your resume.
• As a Python full-stack developer, your role will involve designing, developing, and deploying full-stack applications for artificial intelligence projects, with a focus on low latency and scalability.
• You will also need to optimize applications for better performance and a large number of concurrent users.
• We are looking for a strong technologist who cares about doing things the right way rather than just getting them done, and who thrives in a complex and challenging environment.
Who are we looking for?
• Bachelor's/Master's in Computer Science or equivalent, with at least 3 years of professional experience.
• Solid understanding of design patterns, data structures, and advanced programming techniques
• As an Engineer in our team, you will design, code, test, and debug quality software programs.
• Strong software design and architectural skills in object-oriented and functional programming styles.
• Python, Celery, RabbitMQ, Kafka, Multithreading, Async, Microservices, Docker, Kubernetes.
• Experience in working with Machine Learning Pipelines
• Experience in React.js.
• Experience in Celery, RabbitMQ/Kafka.
• Experience with unit testing tools.
• Experience working with SQL and NoSQL databases such as MySQL and MongoDB.
• Exposure to cloud technologies.
• Demonstrated ability to work in a fast-paced, hyper-growth environment where requirements are constantly changing.
• Nice to have: Experience developing products containing machine learning use cases.
• Familiar with agile techniques like code reviews, pair programming, collective code ownership, clean code, TDD and refactoring.
Baetho is a solution-focused company aimed at creating and democratizing the development of customer experience applications through our proprietary no-code platform.
At Baetho, we offer a fun environment and the chance to work with a highly skilled and motivated team. Our culture is focused on employee happiness, customer satisfaction, and high-quality execution. If you have the right vibe and believe in fairness and freedom, you're a great fit for us.
Working Days: 5 days a week, Monday to Friday (some weekend work required)
- 3+ years of SDE work experience from Product based companies
- Experience in Java, Spring Boot, MySQL, Kafka, HBase, AWS
- Experience in multithreading, distributed systems, coding best practices, and scaling
• Build the front-end of the application in Liferay
• Develop and manage well-functioning databases and backend applications using Liferay
• Test software to ensure responsiveness and efficiency
• Troubleshoot, debug and fix issues
• Write technical documentation
• Manage VM instances on the cloud
Skills:
• Self-driven, flexible, and innovative, with outstanding analytical, problem-solving, and communication skills
• Understanding of fundamental design principles for building a scalable application
• Strong knowledge and experience of Java, Liferay and JBPM
• Technical Skills:
• Languages
o Java
o Javascript
o jQuery
o ANSI SQL
• Frameworks/Libraries/Servers
o Liferay 6.2X
o JBPM 3.x
o Tomcat 7
• Databases
o MySQL Server
• OS
o Linux (Ubuntu/CentOS)
Qualifications:
• Minimum 4 years of experience in full-stack development of Liferay-based portal applications
• Minimum 3 years of experience developing applications using JBPM
• 4-7 years of experience in Java, HTML, and JavaScript
• Hands-on experience in working with MySQL
• Bachelor’s Degree or equivalent experience
As a software engineer, you will be part of a team that focuses on building software applications that scale well. You will play a significant role in shaping our software architecture so that it provides measurable customer value. You understand both technology and business and know the right trade-offs to make. You will be a technical mentor for your team members. You will work closely with your peers, managers, product, design, and operations teams to create solutions that meet business requirements. You will drive engineering and operational excellence across Scatter. You will collaborate with other engineers to surface common pain points, develop solutions, and evangelize best practices.
Qualifications:
- 2-3 years of professional software engineering experience building customer-facing web and/or mobile applications
- Strong coding skills in Python and Django (mandatory)
- Knowledge of HTML, CSS, React.js, jQuery, Bootstrap, or an equivalent stack is an added advantage
- Graduate of a Tier 1 or Tier 2 engineering college
- Excellent knowledge of data structures and algorithms
- Bachelor's or Master's degree in Computer Science or a related discipline
- Experience working in an agile environment
- Quick, self-motivated learner and passionate problem solver
- Excellent debugging and troubleshooting skills, with an enthusiastic attitude toward supporting and resolving customer problems
- Good oral and written communication skills
- Above all, an insatiable desire and ability to learn
Nice to have skills:
- Experience with large-scale SaaS applications
- Experience building web and mobile applications
- Experience designing services on top of cloud infrastructure like AWS, Azure, etc.
- Prior experience building a product from 0 to 1
Benefits
- Ownership and autonomy to drive customer and culture initiatives
- Opportunity to be mentored and to mentor junior engineers
- Remote work
Hi All,
We are hiring!!
Company: SpringML India Pvt Ltd.
Role: Lead Data Engineer
Location: Hyderabad
Website: https://springml.com/
About Company:
At SpringML, we are all about empowering the 'doers' in companies to make smarter decisions with their data. Our predictive analytics products and solutions apply machine learning to today's most pressing business problems so customers get insights they can trust to drive business growth.
We are a tight-knit, friendly team of passionate and driven people who are dedicated to learning, get excited to solve tough problems and like seeing results, fast. Our core values include placing our customers first, empathy and transparency, and innovation. We are a team with a focus on individual responsibility, rapid personal growth, and execution. If you share similar traits, we want you on our team.
What's the opportunity?
SpringML is looking to hire a top-notch Lead Data Engineer who is passionate about working with data and using the latest distributed frameworks to process large datasets.
As a Lead Data Engineer, your primary role will be to design and build data pipelines. You will focus on helping client projects with data integration, data preparation, and implementing machine learning on datasets.
In this role, you will work with some of the latest technologies, collaborate with partners on early wins, take a consultative approach with clients, interact daily with executive leadership, and help build a great company. Chosen team members will be part of the core team and play a critical role in scaling up our emerging practice.
Responsibilities:
- Ability to work as a member of a team assigned to design and implement data integration solutions.
- Build data pipelines using standard frameworks such as Hadoop, Apache Beam, and other open-source solutions.
- Learn quickly – ability to understand and rapidly comprehend new areas – functional and technical – and apply detailed and critical thinking to customer solutions.
- Propose design solutions and recommend best practices for large-scale data analysis
Skills:
- B.Tech degree in computer science, mathematics, or other relevant fields.
- 6+ years of experience in ETL, data warehousing, visualization, and building data pipelines.
- Strong Programming skills – experience and expertise in one of the following: Java, Python, Scala, C.
- Proficient in big data/distributed computing frameworks such as Apache Spark and Kafka
- Experience with Agile implementation methodology