

Dear Candidate,
We are urgently hiring an AWS Cloud Engineer for the Bangalore location.
Position: AWS Cloud Engineer
Location: Bangalore
Experience: 8-11 yrs
Skills: AWS Cloud
Salary: Best in industry (20-25% hike on current CTC)
Note:
Only candidates who can join immediately or within 15 days will be preferred.
Only candidates from Tier 1 companies will be shortlisted and selected.
Candidates with a notice period of more than 30 days will be rejected during screening.
Offer shoppers will be rejected.
Job description:
Title: AWS Cloud Engineer
Preferred location: BLR / HYD; otherwise, any location is fine
Work Mode: Hybrid, per HR policy (currently 1 day per month in office)
Shift Timings: 24x7 (work in shifts on a rotational basis)
Total Experience: 8+ years, of which 5 years of relevant experience is required
Must have: AWS platform, Terraform, Redshift/Snowflake, Python/Shell scripting
Experience and Skills Requirements:
Experience:
8 years of experience in a technical role working with AWS
Mandatory
Technical troubleshooting and problem solving
Management of large-scale IaaS and PaaS solutions on AWS
Cloud networking and security fundamentals
Experience using containerization in AWS
Working data warehouse knowledge; Redshift and Snowflake preferred
Working with IaC: Terraform and CloudFormation
Working understanding of scripting languages, including Python and Shell (see the sketch after these lists)
Collaboration and communication skills
Highly adaptable to changes in a technical environment
Optional
Experience using monitoring and observability toolsets, including Splunk and Datadog
Experience using GitHub Actions
Experience using AWS RDS/SQL-based solutions
Experience working with streaming technologies, including Kafka and Apache Flink
Experience working with ETL environments
Experience working with the Confluent Cloud platform
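To give a flavour of the Python/Shell scripting and monitoring items above, here is a minimal, hypothetical Python sketch using boto3 to list CloudWatch alarms that are currently firing. The region is a placeholder and credentials are assumed to be configured in the environment; it is illustrative only, not part of the role definition.

import boto3

# Minimal sketch: report CloudWatch alarms that are currently in the ALARM state.
# Region is a placeholder; real accounts and alarm names will differ.
cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")

paginator = cloudwatch.get_paginator("describe_alarms")
for page in paginator.paginate(StateValue="ALARM"):
    for alarm in page["MetricAlarms"]:
        print(f"{alarm['AlarmName']}: {alarm['StateReason']}")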
Certifications:
Minimum
AWS Certified SysOps Administrator – Associate
AWS Certified DevOps Engineer - Professional
Preferred
AWS Certified Solutions Architect – Associate
Responsibilities:
Responsible for the technical delivery of managed services across the NTT Data customer account base, working as part of a team providing a Shared Managed Service.
The following is a list of expected responsibilities:
To manage and support a customer’s AWS platform
To be technically hands-on
Provide Incident and Problem management on the AWS IaaS and PaaS Platform
Involvement in the resolution of high-priority incidents and problems in an efficient and timely manner
Actively monitor the AWS platform for technical issues
To be involved in the resolution of technical incident tickets
Assist in the root cause analysis of incidents
Assist with improving efficiency and processes within the team
Examining traces and logs (see the sketch after this list)
Working with third party suppliers and AWS to jointly resolve incidents
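To make the log-examination item above concrete, here is a minimal Python sketch that pulls recent ERROR entries from a CloudWatch Logs group. The log group name is hypothetical and the time window is arbitrary; it illustrates the kind of day-to-day scripting involved, not a prescribed tool.

import time
import boto3

# Minimal sketch: fetch ERROR log events from the last hour.
# The log group name below is a placeholder.
logs = boto3.client("logs", region_name="ap-south-1")

now_ms = int(time.time() * 1000)
response = logs.filter_log_events(
    logGroupName="/aws/lambda/example-service",   # hypothetical log group
    filterPattern="ERROR",
    startTime=now_ms - 60 * 60 * 1000,
    endTime=now_ms,
)
for event in response["events"]:
    print(event["timestamp"], event["message"].strip())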
Good to have:
Confluent Cloud
Snowflake
Best Regards,
Minakshi Soni
Executive - Talent Acquisition (L2)
Rigel Networks
Worldwide Locations: USA | HK | IN

Only for Indore and nearby candidates
1. We are hiring an Android developer who has expertise in design as well as development
2. He or she should have expertise in XML design and API parsing (JSON parsing)
3. Complete freshers can also apply, as we have started a program for freshers as well
Additional information:
1. If you have expertise in Android development and good working knowledge of API parsing, please walk in for an interview between 10:30 A.M. and 7:30 P.M., Monday to Saturday
2. Company name: Logical Soft Tech
3. Website: https://logicalsofttech.com/career
4. Address: 2nd floor, 388, PU4, Scheme 54 PU4, in front of Eye Retina Hospital, Vijay Nagar, Indore, M.P.
Looking for an MS Dynamics AX Technical Consultant who can join within 10-15 days
Hybrid Mode: Bangalore/Hyderabad
1. 6 years of relevant ERP experience with a strong understanding of Microsoft Dynamics AX (in addition to a degree or years of previous experience), preferably with a Dynamics AX certification
2. Design, develop, and customize solutions within Microsoft Dynamics AX 2012.
3. Implement AX customizations using X++, MorphX, and SSRS reports.
4. Data Migration: Experience in data migration using DIXF and other data tools
Organization Name: Agami Realty
Company Website: https://agami.co.in/
Position Name: Real Estate Sales Executive
Job Location: Boisar
Nature of Job: Permanent
Monthly Salary: Rs.10,000 to Rs.14,000
Candidates residing in Palghar, Boisar, Vangaon and Dahanu will be preferred
Sales Executive job responsibilities include, but are not limited to:
- Telephonic conversations with prospects, explaining project amenities and pricing
- Setting up personal meetings with potential clients to ensure maximum site visits
- Showing the sample flat and explaining the project in detail
- Explaining EMI and loan eligibility to the customer
- Following up on housing loan processing until disbursement
Requirements:
- Excellent knowledge of MS Office
- Fast learner with a passion for sales


Be a part of the growth story of a rapidly growing organization in AI. We are seeking a passionate Machine Learning (ML) Engineer with a strong background in developing and deploying state-of-the-art models on the cloud. You will participate in the complete cycle of building machine learning models, from conceptualization of ideas through data preparation, feature selection, training, evaluation, and productionization.
On a typical day, you might build data pipelines, develop a new machine learning algorithm, train a new model, or deploy the trained model on the cloud. You will have a high degree of autonomy, ownership, and influence over your work, the evolution of the machine learning organization, and the direction of the company.
Required Qualifications
- Bachelor's degree in computer science/electrical engineering or equivalent practical experience
- 7+ years of industry experience in Data Science and ML/AI projects, including experience productionizing machine learning in an industry setting
- Strong grasp of statistical machine learning, linear algebra, deep learning, and computer vision
- 3+ years of experience with one or more general-purpose programming languages, including but not limited to R and Python
- Experience with PyTorch, TensorFlow, or other ML frameworks
- Experience using cloud services such as AWS, GCP, or Azure; understanding of the principles of cloud-native application development
In this role you will:
- Design and implement ML components, systems and tools to automate and enable our various AI industry solutions
- Apply research methodologies to identify the machine learning models to solve a business problem and deploy the model at scale.
- Own the ML pipeline from data collection, through the prototype development to production.
- Develop high-performance, scalable, and maintainable inference services that communicate with the rest of our tech stack (see the sketch below)
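As a rough, hypothetical sketch of the last two points (owning a model through to production and serving it for inference), the Python snippet below trains a tiny PyTorch classifier on synthetic data, exports it with TorchScript, and reloads it for prediction. The model, data, and file names are invented for illustration and do not describe our actual stack.

import torch
import torch.nn as nn

# Hypothetical, minimal end-to-end sketch: train, export, and reload a model.
torch.manual_seed(0)
X = torch.randn(256, 4)                      # synthetic features
y = (X.sum(dim=1) > 0).long()                # synthetic binary labels

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for _ in range(100):                         # short training loop
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

# Export for serving; an inference service would load this artifact.
scripted = torch.jit.script(model)
scripted.save("classifier.pt")

serving_model = torch.jit.load("classifier.pt")
serving_model.eval()
with torch.no_grad():
    probs = torch.softmax(serving_model(torch.randn(1, 4)), dim=1)
print(probs)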
Roles and responsibilities:
Actively participate in requirement analysis and ensure all scenarios/use cases are captured
Good verbal and written communication skills
Strong analytical and problem-solving skills
Coordinate with QA and Product Management to ensure proper delivery and execution of product/feature deliverables
Prior experience in Agile-Scrum software development environment
Perform effective code reviews submitted by peers
Perform Unit testing (tools – JUnit, Mockito)
Well-versed with code coverage concepts and tools
Familiar with continuous integration tools
Well-versed with refactoring and code-smell concepts
Propose and implement technical solutions
Deliver relevant technical artifacts based on standard practices.
Deliver a program on time with high quality
Understand and review requirements w.r.t. business needs
Participate regularly in project meetings with the customer
Skillset
Must have: Java 7/8, Spring, Spring Boot, Microservices, JPA/Hibernate, REST web services
AWS: usage of at least one of SES / SQS / SNS / S3 / Lambda / DynamoDB; Jenkins, Bitbucket, Git
Basic knowledge of JavaScript, HTML, Struts, EJB
Experience with SQL and NoSQL technologies is required (e.g., MongoDB, DynamoDB)
Prior experience in an Agile-Scrum software development environment is required
Experience with Jira and Confluence preferred
Good to have: AWS EC2, Elastic Beanstalk, Docker, Swing, Datadog/Splunk
Thanks and Regards,
Seema Bisht
Senior Talent Acquisition Partner || Trantor Inc.
Plot No. G-9, IT Park - Chandigarh, India - 160101
We are looking for a highly motivated developer with at least one year of strong hands-on experience in Java to join our startup. You would play a pivotal role in contributing to the initial tech stack. You would also be responsible for designing and implementing product requirements that are highly usable, scalable, extensible, and maintainable. You should be comfortable working across the different technologies/frameworks that we work on: Microservices, Java, Spring, Spring Boot, MySQL, Kubernetes, AWS.
Responsibilities and Duties:
- Design and build scalable REST APIs on Spring Boot
- Develop, test, tune for performance and deploy microservices
- Collaborate with the team, optimize and refactor the back-end architecture
- Maintain high standards of quality for code, documentation and other deliverables
- Active cross-team coordination would be expected
What are we looking for?
- 1+ years of experience in Core Java & backend technologies
- Good working knowledge of design patterns & OOAD
- Excellent analytical and problem-solving skills
- The skills that we consider: Java, MySQL/RDS, Spring/ Play, Maven, Redis, Kafka/SQS, Elasticsearch, AWS
- Experience in micro-services
- Previously worked in a startup



About us
DataWeave provides Retailers and Brands with “Competitive Intelligence as a Service” that enables them to take key decisions that impact their revenue. Powered by AI, we provide easily consumable and actionable competitive intelligence by aggregating and analyzing billions of publicly available data points on the Web to help businesses develop data-driven strategies and make smarter decisions.
Data Science@DataWeave
We, the Data Science team at DataWeave (called Semantics internally), build the core machine learning backend and structured domain knowledge needed to deliver insights through our data products. Our underpinnings are: innovation, business awareness, long-term thinking, and pushing the envelope. We are a fast-paced lab within the org, applying the latest research in Computer Vision, Natural Language Processing, and Deep Learning to hard problems in different domains.
How do we work?
It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems out there. We are in the business of making sense of messy public data on the web. At serious scale!
What do we offer?
● Some of the most challenging research problems in NLP and Computer Vision. Huge text and image datasets that you can play with!
● Ability to see the impact of your work and the value you're adding to our customers almost immediately.
● Opportunity to work on different problems and explore a wide variety of tools to figure out what really excites you.
● A culture of openness. Fun work environment. A flat hierarchy. Organization-wide visibility. Flexible working hours.
● Learning opportunities with courses and tech conferences. Mentorship from seniors in the team.
● Last but not the least, competitive salary packages and fast-paced growth opportunities.
Who are we looking for?
The ideal candidate is a strong software developer or a researcher with experience building and shipping production-grade data science applications at scale. Such a candidate has a keen interest in liaising with the business and product teams to understand a business problem and translate it into a data science problem.
You are also expected to develop capabilities that open up new business productization opportunities.
We are looking for someone with a Master's degree and 1+ years of experience working on problems in NLP or Computer Vision.
If you have 4+ years of relevant experience with a Master's degree (PhD preferred), you will be considered for a senior role.
Key problem areas
● Preprocessing and feature extraction on noisy and unstructured data -- both text and images.
● Keyphrase extraction, sequence labeling, entity-relationship mining from texts in different domains.
● Document clustering, attribute tagging, data normalization, classification, summarization, sentiment analysis (see the sketch after this list).
● Image-based clustering and classification, segmentation, object detection, extracting text from images, generative models, recommender systems.
● Ensemble approaches for all the above problems using multiple text- and image-based techniques.
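As a minimal illustration of the document-clustering problem area above, the Python sketch below groups a handful of toy product titles using TF-IDF features and k-means in scikit-learn. The sample strings and cluster count are invented for the example and say nothing about DataWeave's actual pipelines.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy, hypothetical product titles; real inputs are noisy, web-scale data.
titles = [
    "apple iphone 13 128gb blue",
    "iphone 13 smartphone 128 gb",
    "nike running shoes men size 10",
    "nike men's road running shoe",
    "samsung 55 inch 4k smart tv",
    "samsung 55in uhd television",
]

vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(titles)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)

for title, label in zip(titles, labels):
    print(label, title)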
Relevant set of skills
● Have a strong grasp of concepts in computer science, probability and statistics, linear algebra, calculus, optimization, algorithms and complexity.
● Background in one or more of information retrieval, data mining, statistical techniques, natural language processing, and computer vision.
● Excellent coding skills in multiple programming languages, with experience building production-grade systems. Prior experience with Python is a bonus.
● Experience building and shipping machine learning models that solve real-world engineering problems. Prior experience with deep learning is a bonus.
● Experience building robust clustering and classification models on unstructured data (text, images, etc.). Experience working with Retail domain data is a bonus.
● Ability to process noisy and unstructured data to enrich it and extract meaningful relationships.
● Experience working with a variety of tools and libraries for machine learning and visualization, including numpy, matplotlib, scikit-learn, Keras, PyTorch, TensorFlow.
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Be a self-starter: someone who thrives in fast-paced environments with minimal 'management'.
● It's a huge bonus if you have some personal projects (including open source contributions) that you work on during your spare time. Show off some of the projects you have hosted on GitHub.
Role and responsibilities
● Understand the business problems we are solving. Build data science capabilities that align with our product strategy.
● Conduct research. Do experiments. Quickly build throwaway prototypes to solve problems pertaining to the Retail domain.
● Build robust clustering and classification models in an iterative manner that can be used in production.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Take end to end ownership of the projects you are working on. Work with minimal supervision.
● Help scale our delivery, customer success, and data quality teams with constant algorithmic improvements and automation.
● Take initiatives to build new capabilities. Develop business awareness. Explore productization opportunities.
● Be a tech thought leader. Add passion and vibrance to the team. Push the envelope. Be a mentor to junior members of the team.
● Stay on top of the latest research in deep learning, NLP, Computer Vision, and other relevant areas.


