Senior Software Engineer (Java/Scala/SOLR/Data/Hadoop)

About CodeHall Technology Pvt Ltd
Whether it is building a new system from scratch, maintaining an existing system, or adding features to a legacy system, our team is committed to providing the best open-source solution for your needs. When existing systems do not meet your needs, we help you identify the right technology solution. We have you covered while you focus on delivering the best business experience for your customers.
Similar jobs
Location: Bhavnagar [On-site]
Job Type: Full-time
Responsibilities:
- Develop, test, and maintain web applications using Laravel framework.
- Build and integrate RESTful APIs.
- Ensure high performance and responsiveness of applications.
- Troubleshoot issues and debug code.
Requirements:
- 6+ months of experience in PHP Laravel development.
- Strong knowledge of PHP, Laravel, MySQL, MVC architecture, and OOP.
- Familiarity with HTML, CSS, JavaScript.
- Experience with Git or other version control systems.
Education:
- Bachelor’s degree in Computer Science, Information Technology, or related field.
What We Offer:
- Competitive salary
- Growth opportunities
- Friendly and collaborative team environment
DealShare focuses on non-metro and rural markets. DealShare has raised Series C funding of USD 21 million from key investors like WestBridge Capital, Falcon Edge Capital, Matrix Partners India, Omidyar Network, Z3 Partners and partners of DST Global, and has total funding of USD 34 million. They have 2 million customers across Rajasthan, Gujarat, Maharashtra, Karnataka and Delhi NCR, with 1.2 million monthly transactions and an annual GMV of USD 100 million. The aim is to expand operations to 100 cities across India and reach an annual GMV of USD 500 million by the end of 2021.
DealShare started in September 2018 and had 5,000 active customers in the first three months. Today there are 25K transactions per day, 1 lakh DAU and 10 lakh MAU, with a monthly GMV of INR 100 crores and 50% growth month over month. The aim is to hit 2 lakh transactions per day with an annual GMV of USD 500 million by 2021.
We are hiring for various teams across discovery (search, recommendation, merchandising, intelligent notifications), pricing (automated pricing, competition price awareness, balancing revenue with profits, etc.), user growth and retention (bargains, gamification), monetisation (ads), order fulfillment (cart/checkout, warehousing, last mile, delivery promise, demand forecasting), customer support, data infrastructure (warehousing, analytics), and ML infrastructure (data versioning, model repository, model training, model hosting, feature store, etc.). We are looking for passionate problem solvers to join us, solve really challenging problems, and scale DealShare systems.
You will:
● Implement the solution with minimal guidance once the approach has been finalised with senior engineers.
● Write code that has good low-level design and is easy to understand, maintain, extend and test.
● Take end-to-end ownership of a product/feature from development to production, including fixing issues.
● Ensure high unit, functional and integration automated test coverage. Ensure releases are stable.
● Communicate with various stakeholders (product, QA, senior engineers) as necessary to ensure quality deliverables, smooth execution and launch.
● Participate in code reviews and improve development and testing processes.
● Participate in hiring great engineers.
Required:
● Bachelor’s degree (4 years) or higher in Computer Science or equivalent, and 1-3 years of experience in software development.
● Excellent at problem solving; an independent thinker.
● Good understanding of computer science fundamentals, data structures and algorithms, and object-oriented design.
● Good coding skills in any object-oriented language (C++, Java, Scala, etc.), preferably Java.
● Prior experience building one or more modules of a large-scale, highly available, low-latency, high-quality distributed system is preferred.
● Ability to multitask and thrive in a fast-paced, timeline-driven environment.
● Good team player with the ability to collaborate with others.
● Self-driven and motivated, with a strong sense of ownership.
A plus:
● Prior experience working in Java.
● Prior experience using AWS offerings: EC2, S3, DynamoDB, Lambda, API Gateway, CloudFront, etc.
● Prior experience working on big data technologies: Spark, Hadoop, etc.
● Prior experience with asynchronous processing (queuing systems) and workflow systems.
We believe that by empowering the 2 crore MSME manufacturers in India with easy-to-use, mobile-first workflow management tools, we can play a pivotal role in realising India’s dream of becoming the world’s most desired manufacturing destination and a $5 trillion economy. Every line of code we write, every feature we add, every pixel we create, everything we do helps us get one step closer to our vision.
Our engineering team ardently believes in these 2 core fundamentals:
Polyglot Programming: We are language-agnostic and focus on finding the most optimal and robust solution to a problem, independent of the programming language.
Asynchronous Communication: async is an important factor in our team’s productivity. Not only does async produce the best work results, but it also lets people do more meaningful work and live freer, more fulfilled lives.
Objectives 🎯
● Design and develop highly scalable, reliable, and fault-tolerant systems for one of the fastest-growing startups in India
● Participate in code reviews and share knowledge across the team
● Pair with team members on functional and non-functional requirements and spread design philosophy and goals across the team
● Communicate, collaborate and work effectively across distributed teams
● Understand users and their behavior, and continuously contribute to making their experience better with each release
Who are we looking for? 😎
● Having built scalable backends using JavaScript/TypeScript would be preferable. You should be able to design RESTful APIs that are not overly constrained and can easily be consumed by frontend developers
● You have worked with relational databases like MySQL and Postgres and understand partitioning and sharding, as well as NoSQL databases such as MongoDB, Couchbase, etc.
● You have worked with search systems, caching systems and queuing systems. You should be at ease maintaining cloud instances on AWS, GCP, and the like
● Experience with Docker, Kubernetes in production would be prized
● You should have a deep understanding of system design, data structures, and algorithms and understand how to apply them to design pragmatic solutions
● You have experience in identifying, debugging, and resolving complex production issues
● Relevant working experience of at least 3 years
** Brownie points if you have experience at a product startup operating at scale
Looking for candidates from service-based companies or the services division of any company.
Minimum Qualification:
- Hands-on work with Java (language understanding: Java 8, lambdas, collections, popular frameworks and libraries), JVM, GC tuning, performance tuning (see the illustrative sketch after this list)
- Worked on REST frameworks/libraries like Spring MVC, Spring Boot, Dropwizard, RestExpress, etc.
- Worked on relational data stores such as MySQL, Oracle or Postgres
- Worked on non-relational data stores such as Cassandra, HBase, Couchbase, MongoDB, etc.
- Worked on caching infra such as Redis, Memcached, Aerospike, Riak, etc.
- Worked on queueing infra such as Kafka, RabbitMQ, ActiveMQ, etc.
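A minimal, illustrative Java 8 sketch of the kind of language fluency the list above refers to (streams, lambdas, collections); the Order class and the data are hypothetical and used only for illustration:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class OrderStats {

    // Hypothetical value class, not part of any real codebase.
    static class Order {
        final String city;
        final double amount;

        Order(String city, double amount) {
            this.city = city;
            this.amount = amount;
        }
    }

    public static void main(String[] args) {
        List<Order> orders = Arrays.asList(
                new Order("Jaipur", 250.0),
                new Order("Surat", 120.0),
                new Order("Jaipur", 80.0));

        // Java 8 streams and lambdas: total order value per city.
        Map<String, Double> totalByCity = orders.stream()
                .collect(Collectors.groupingBy(o -> o.city,
                        Collectors.summingDouble(o -> o.amount)));

        totalByCity.forEach((city, total) ->
                System.out.printf("%s -> %.2f%n", city, total));
    }
}
```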
Opportunity to work with a Silicon Valley based security and governance start-up.
About Privacera
Privacera, Inc. is a California-based start-up looking for Senior Software Engineers to work out of our Mumbai/Pune office. Privacera is a cloud-based product that uses cloud-native services in AWS, Azure and GCP. Privacera is a fast-growing start-up and provides ample opportunity to work on cloud services like AWS S3, DynamoDB, Kinesis, Redshift, EMR, Azure ADLS, HDInsight, GCP GCS, GCP Pub/Sub and other services.
We are looking for motivated individuals who are keen to work on Cloud or Big Data services or have worked on Cloud and Big Data. If you want to work in a start-up culture and are ready for the challenge, then join us on our exciting journey.
Responsibilities:
- Design, code and debug cloud-native applications.
- Evaluate and identify new technologies for implementation
- Determine operational feasibility by evaluating analysis, problem definition, requirements, solution development and proposed solutions
- Write well designed, testable, efficient code
- Develop software verification plans and quality assurance procedures
- Serve as a subject matter expert
Requirements:
- 5+ years of relevant experience in software development
- Deep understanding of public cloud infrastructure (AWS, Azure or Google)
- Experience with large-scale distributed systems
- Ability to troubleshoot distributed systems
- Prior experience with data encryption, TLS/SSL is a strong plus
- Experience with Docker and Kubernetes is a plus
- Deep experience with Java
- Excellent communication (writing, conversation, presentation) skills, consensus building, quick learner
Good to have: experience in production support (Tier 4).
Experience with these technologies is a plus: AWS, Microsoft Azure, Google Cloud, Cloudera, Snowflake, MongoDB, Oracle, Databricks, DataStax, Confluent.
Job Description
Primary Skills
- Server Side (Java) & AWS serverless framework.
- Must have hands-on experience with a serverless framework.
- Design knowledge/experience of cloud-based web applications. Familiarity with software design representation tools like Astah, Visio, etc.
- Must have good experience with AWS (overall knowledge, EC2 volumes, EC2 security groups, EC2 AMIs, Lambda, S3, AWS Backup, CloudWatch, CloudFormation, CloudTrail, IAM, Secrets Manager, Step Functions, Cost Explorer, KMS, VPC/subnets); a minimal Lambda handler sketch follows this list.
- Understanding business requirements w.r.t. UI/UX.
- Working experience on development/staging/production servers.
- Good testing and verification skills
- Knowledge of SSL certificates and encryption.
- Knowledge of Docker containerization.
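A minimal sketch of what a Java AWS Lambda handler behind API Gateway might look like in such a serverless setup; the class name and payload shape are hypothetical, and the wiring (routes, IAM, etc.) would live in the serverless framework configuration:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

import java.util.Map;

// Hypothetical handler: returns a greeting for the "name" field of the input payload.
public class GreetingHandler implements RequestHandler<Map<String, String>, Map<String, String>> {

    @Override
    public Map<String, String> handleRequest(Map<String, String> input, Context context) {
        String name = input.getOrDefault("name", "world");
        context.getLogger().log("Handling greeting request for: " + name);
        return Map.of("message", "Hello, " + name);
    }
}
```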
Soft Skills
- Excellent interpersonal, oral and writing communication skills.
- Strong Analytical and Problem-solving skills.
- Should have the skills to understand and analyze customers' requirements and expectations.
- Must have experience interacting with customers.
- Work experiences with international cross-culture teams (Good to have)
Secondary Skills
- Scripting using Python
- Good to have knowledge of identity management
- UI/UX
- Knowledge of ReactJS/TypeScript/Bootstrap
- Understanding business use cases w.r.t. UI/UX
- Fixing integration issues on the cloud (front end/back end/system/service APIs)
Be Part Of Building The Future
Dremio is the Data Lake Engine company. Our mission is to reshape the world of analytics to deliver on the promise of data with a fundamentally new architecture, purpose-built for the exploding trend towards cloud data lake storage such as AWS S3 and Microsoft ADLS. We dramatically reduce and even eliminate the need for the complex and expensive workarounds that have been in use for decades, such as data warehouses (whether on-premise or cloud-native), structural data prep, ETL, cubes, and extracts. We do this by enabling lightning-fast queries directly against data lake storage, combined with full self-service for data users and full governance and control for IT. The results for enterprises are extremely compelling: 100X faster time to insight; 10X greater efficiency; zero data copies; and game-changing simplicity. And equally compelling is the market opportunity for Dremio, as we are well on our way to disrupting a $25BN+ market.
About the Role
The Dremio India team owns the Data Lake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open-source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides various opportunities to learn, deliver, and grow in your career. We are looking for innovative minds with experience in leading and building high-quality distributed systems at massive scale and solving complex problems.
Responsibilities & ownership
- Lead, build, deliver and ensure customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product.
- Work on distributed systems for data processing with efficient protocols and communication, locking and consensus, schedulers, resource management, low latency access to distributed storage, auto scaling, and self healing.
- Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
- Lead the team to solve complex and unknown problems
- Solve technical problems and customer issues with technical expertise
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Mentor other team members for high quality and design
- Collaborate with Product Management to deliver on customer requirements and innovation
- Collaborate with Support and field teams to ensure that customers are successful with Dremio
Requirements
- B.S./M.S./equivalent in Computer Science or a related technical field, or equivalent experience
- Fluency in Java/C++ with 8+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models, and their use in developing distributed and scalable systems
- 5+ years experience in developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Ability to solve ambiguous, unexplored, and cross-team problems effectively
- Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and Google Cloud)
- Understanding of distributed file systems such as S3, ADLS, or HDFS
- Excellent communication skills and affinity for collaboration and teamwork
- Ability to work individually and collaboratively with other team members
- Ability to scope and plan solutions for big problems, and mentor others on the same
- Interested and motivated to be part of a fast-moving startup with a fun and accomplished team
Spark/Scala experience should be more than 2 years.
A combination of Java and Scala is fine; we are also open to a Big Data developer with strong core Java concepts. Role: Scala/Spark Developer.
Strong proficiency in Scala on Spark (Hadoop); Scala + Java is also preferred (a minimal Spark sketch follows this list).
Complete SDLC process and Agile Methodology (Scrum)
Version control / Git
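A minimal, illustrative Spark sketch of the kind of batch aggregation such a role involves, written here with Spark's Java API (the same structure applies with Scala's DataFrame API); the input path and column names are hypothetical:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

public class SalesByRegion {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("sales-by-region")
                .master("local[*]") // local mode for the sketch; a real job would target a cluster
                .getOrCreate();

        // Hypothetical CSV with columns: region, amount
        Dataset<Row> sales = spark.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("data/sales.csv");

        // Total sales amount per region.
        Dataset<Row> totals = sales.groupBy(col("region"))
                .agg(sum(col("amount")).alias("total_amount"));

        totals.show();
        spark.stop();
    }
}
```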








