Spark/Scala experience should be more than 2 years.
A combination of Java and Scala is fine; we are also open to a Big Data Developer with strong Core Java concepts. (Scala/Spark Developer role.)
Strong proficiency in Scala on Spark (Hadoop); Scala plus Java is also preferred.
Complete SDLC process and Agile Methodology (Scrum)
Version control / Git
About Accion Labs
Accion Labs, Inc. is a global technology firm headquartered in Pittsburgh, ranked the number one IT company based out of Pittsburgh.
Accion Labs Inc: Winner of the Fastest Growing Company award in Pittsburgh, and ranked the #1 IT services company two years in a row (2014, 2015) by the Pittsburgh Business Times. Accion Labs is venture-funded, profitable and fast-growing, allowing you an opportunity to grow with us: 11 global offices, 1300+ employees, and 80+ tech company clients. 90% of the clients we work with are direct clients, on project-based engagements. We offer a full range of product life-cycle services in emerging technology segments including Web 2.0, Open Source, SaaS/Cloud, Mobility, IT Operations Management/ITSM, Big Data and traditional BI/DW, Automation Engineering (Rackspace team), and DevOps engineering.
Employee strength: 1300+ employees
Why Accion Labs:
- Emerging technology projects i.e. Web 2.0, SaaS, cloud, mobility, BI/DW and big data
- Great learning environment
- Onsite opportunities, depending entirely on project requirements
- We invest in training our resources in the latest frameworks, tools, processes and best practices, and also cross-train our resources across a range of emerging technologies, enabling you to develop more marketable skills
- Employee friendly environment with 100% focus on work-life balance, life-long learning and open communication
- We allow our employees to interact directly with clients
A short description of the company:
It is an Account Engagement Platform that helps B2B organizations achieve predictable revenue growth by putting the power of AI, big data, and machine learning behind every member of the revenue team.
We are looking for a Python Developer.
- Augmenting, improving, redesigning, and/or re-implementing Dolat's low-latency/high-throughput production trading environment, which collects data from and disseminates orders to exchanges around the world
- Optimizing this platform by using network and systems programming, as well as other advanced techniques
- Developing systems that provide easy access to historical market data and trading simulations
- Building risk-management and performance-tracking tools
- Shaping the future of Dolat through regular interviewing and infrequent campus recruiting trips
- Implementing domain-optimized data structures
- Learn and internalize the theories behind the current trading system
- Participate in the design, architecture and implementation of automated trading systems
- Take ownership of system from design through implementation
- Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field; at least 3 years of relevant experience may be considered in lieu of the above for candidates from a different stream of education.
- Well-versed in, and with 3+ years of hands-on, demonstrable experience in:
▪ Stream and batch big data pipeline processing using Apache Spark and/or Apache Flink.
▪ Distributed cloud-native computing, including serverless functions.
▪ Relational, object store, document, graph, etc. database design and implementation.
▪ Microservices architecture, API modeling, design, and programming.
- 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.
- Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework (see the sketch after this list).
- In-depth knowledge of standard programming languages such as Scala and/or Java.
- 3+ years of hands-on development experience in one or more libraries and frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, and the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie and Hive); extra points if you can demonstrate your knowledge with working examples.
- 3+ years of hands-on development experience in one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc.
- Practical knowledge of distributed systems, including partitioning, bucketing, the CAP theorem, replication, horizontal scaling, etc.
- Passion for distilling large volumes of data and analyzing performance, scalability, and capacity issues in big data platforms.
- Ability to clearly distinguish system performance from Spark job performance, and to perform Spark performance tuning and resource optimization.
- Perform benchmarking/stress tests and document best practices for different applications.
- Proactively work with tenants on improving overall performance and ensure the system is resilient and scalable.
- Good understanding of virtualization and containerization; must demonstrate experience with technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed, with demonstrable working experience, in API Management, API Gateways, Service Mesh, Identity & Access Management, and Data Protection & Encryption.
- Hands-on, demonstrable working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality and security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate experience with at least FIVE (5) services offered under AWS and/or Azure and/or Google Cloud in any of these categories: Compute or Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, or Security, Identity & Compliance (or equivalent demonstrable cloud platform experience).
- Good understanding of storage, networks, and storage networking basics, which will enable you to work in a cloud environment.
- Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a business applications / API services environment.
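The Spark item above asks for executable service code using Spark RDD, Spark SQL, and Structured Streaming. As a rough, illustrative sketch only (not part of the role description), the snippet below shows a minimal Structured Streaming job in Scala that consumes a hypothetical Kafka topic and maintains running counts with Spark SQL; the broker address, topic name, and application name are assumptions.

```scala
// Minimal sketch: Structured Streaming job that reads a (hypothetical) Kafka topic
// and keeps a running count per key. Requires the spark-sql-kafka connector.
import org.apache.spark.sql.SparkSession

object ClickCountJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("click-count")   // assumed application name
      .getOrCreate()
    import spark.implicits._

    // Read a stream of raw records from Kafka.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // assumed broker
      .option("subscribe", "clicks")                        // assumed topic
      .load()

    // Kafka keys/values arrive as bytes; cast to strings and count events per key.
    val counts = raw
      .select($"key".cast("string").as("key"), $"value".cast("string").as("value"))
      .groupBy($"key")
      .count()

    // Write the running counts to the console; a real service would use a durable sink.
    counts.writeStream
      .outputMode("complete")
      .format("console")
      .start()
      .awaitTermination()
  }
}
```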
DeepSource is working on building tools that help developers ship good code. There are over 40 million developers in the world, and all of them write and review code in some form. There's a massive opportunity to impact how software is built, right from where the code is written, using automation and intelligence that not only improve developer productivity but also increase software's robustness.
The Language Engineering team works on the source code analyzers, including both programming languages and configuration-as-code systems. As a member of the Language Engineering team, you will work on building the best, most comprehensive, Scala analyzer in the world. You will add new rules and Autofixes for finding more issues with code and automatically fixing them. You will be involved with the community to understand the problems with static analysis tools in the Scala ecosystem.
As a member of the Language Engineering team, you will:
- Identify bad code practices in Scala and write new analyzers to detect them (see the sketch after this list).
- Improve the coverage of automatically fixable issues.
- Ensure fewer false positives are reported by the analyzer.
- Work on the internal tools that support analysis runtimes.
- Contribute to open-source static analysis tools.
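As a rough illustration of what writing a new analyzer can involve (this is a sketch of ours, not DeepSource's implementation, and it assumes the Scalameta parser, which the posting does not name), the snippet below parses Scala source and flags `var` definitions and explicit `return` expressions with their line numbers.

```scala
// Toy static analyzer sketch built on Scalameta (assumed dependency).
import scala.meta._

object ToyAnalyzer {
  final case class Issue(line: Int, message: String)

  def analyze(code: String): List[Issue] =
    code.parse[Source] match {
      case Parsed.Success(tree) =>
        tree.collect {
          // Type patterns keep the match stable across Scalameta versions.
          case v: Defn.Var    => Issue(v.pos.startLine + 1, "Prefer val over var")
          case r: Term.Return => Issue(r.pos.startLine + 1, "Avoid explicit return")
        }
      case _: Parsed.Error => Nil // a real analyzer would surface parse errors
    }

  def main(args: Array[String]): Unit = {
    val sample =
      """object Demo {
        |  var counter = 0
        |  def bump(): Int = { counter += 1; return counter }
        |}""".stripMargin
    analyze(sample).foreach(i => println(s"line ${i.line}: ${i.message}"))
  }
}
```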
We’re looking for someone who has:
- Strong foundational knowledge in Computer Science.
- At least 3 years of professional software development experience in Scala and Java.
- Understanding of the nuances of execution of source code (ASTs, data flow graphs, etc.).
- Familiarity with Scala best practices followed in the industry.
- Native experience with Linux/Unix environments.
- A focus on delivering high-quality code through strong testing practices.
We offer competitive compensation with meaningful stock options, a generous vacation policy, and a workstation of your choice, to name a few of the perks.
Job description
We are looking for a passionate Software Development Engineer to develop, test, maintain and document program code in accordance with user requirements and system technical specifications. As a Software Development Engineer, you will work with other Developers and Product Managers throughout the software development life cycle.
Software Development Engineer responsibilities include analysing requirements, defining system functionality, and writing code in the company's current technology stack. The candidate is expected to be familiar with the software development life cycle (SDLC) process, from preliminary system analysis to testing and deployment. Ultimately, the role of the Software Engineer is to build high-quality, innovative and fully performing software that complies with coding standards and technical design. Your goal will be to build efficient programs and systems that serve user needs.
To be qualified for this role, you should hold a minimum of a Bachelor's degree in a relevant field, like Computer Science, IT or Software Engineering. You should be a team player with a keen eye for detail and problem-solving skills. If you also have experience in SDLC, Agile frameworks and popular coding languages (e.g., Java), and strong computer science fundamentals, we'd like to meet you.
Years of experience: 2 to 10 years.
Roles & Responsibilities
The overview of this position (based in Chennai, India) includes:
- Develops, enhances, debugs, supports, maintains and tests software applications that support business units or supporting functions. These application program solutions may involve diverse development platforms, software, hardware, technologies and tools.
- Participates in the design, development and implementation of complex applications, often using new technologies.
- Technology professional with experience in designing and managing the implementation of future-looking, flexible and reusable enterprise applications and components.
- Expert in translating business requirements into an application design that includes Data Model, Web Screens, Web Services, and batch processing.
- May provide technical direction and system architecture for individual initiatives.
- Serves as a fully seasoned/proficient technical resource.
- Deploy programs, gather and evaluate user feedback
- Recommend and execute improvements
- Create technical documentation for reference and reporting
- Develop software verification plans and quality assurance procedures
- Document and maintain software functionality
- Ensure software is updated with the latest features
- Good interpersonal and technology understanding skills
- Evaluate open-source components and integrate into product pipeline
Skills and Qualifications
- Hands-on experience in analysis, design, coding, and implementation of complex, custom-built applications.
- Strong Java development skills (Java, J2EE, Struts, Spring, Web Services, Eclipse, UI screens, AngularJS, React.js)
- Excellent debugging skills
- Strong knowledge of databases (MySQL, MS SQL Server and NoSQL databases)
- Understanding of various deployment servers (Apache Tomcat is a must)
- Strong OO skills, including design patterns knowledge, are a must.
- Strong understanding of creating and maintaining web services.
- Understanding of the software development life cycle
- Experience with Implementation and release management activities
- Good understanding of unit/system and functional testing methodology
- Experience working in large transaction-based systems
- Knowledge of software best practices, like Test-Driven Development (TDD) and Continuous Integration (CI)
- Experience documenting technical functions
- Desire to contribute to the wider technical community through collaboration, coaching, and mentoring of other technologists.
- Experience with Linux-based systems and development of shell scripts.
Job Training
- Training on the coding paradigms, guidelines, frameworks, usage of the applications would be provided by the engineers
- Periodic training sessions would be conducted by the technical architects in terms of technology and skills to be learnt
- Periodic, structured training would be provided on the applications
Hours & Environment
- Typical 40 hours of work a week
- Depending on the requirements, work hours may have to be extended during the day or over the weekend
Roles and responsibilities
- Develop well-designed, performant and scalable microservices
- Write reusable, testable, and efficient code that follows software development best practices
- Integrate data storage solutions including databases, key-value stores, blob stores, etc.
- Expose business functionality to frontend/mobile applications and partner systems through secure and scalable APIs.
- Build integrations with third-party applications through APIs to ingest and process data
- Ensure security and data protection aspects within the applications
- Contribute to DevOps by building CI/CD pipelines to automate releases
- Ensure high performance and availability of distributed systems and applications
- Interact directly with client project team members and operational staff to support live customer deployments and production issues.
- 4+ years of experience in developing applications using Scala and related technologies.
- Thorough understanding of multithreading concepts and async execution using the Actor model (see the sketch after this list).
- Thorough understanding of the Play framework, GraphQL and gRPC technologies.
- Experience in using DAL and ORM (object-relational mapper) libraries for data access.
- Experience in developing and hosting APIs and integrating with external applications.
- Experience in building data models and repositories using relational and NoSQL databases.
- Knowledge of Jira, Bitbucket and agile methodologies.
- Good to have: knowledge of AWS services like Lambda, DynamoDB, Kinesis and others.
- Understanding of fundamental design principles behind a scalable application.
- Familiarity with event-driven programming and distributed architectures.
- Strong unit test and debugging skills
- Affinity for learning and applying new technologies and solving new problems
- Effective organizational skills with strong attention to detail
- Experience in working with Docker is a plus
- Comfortable working in a Unix/Linux environment
- Strong communication skills — both written and verbal
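As a rough illustration of async execution with the Actor model mentioned in the requirements above (a minimal sketch assuming Akka Typed; the posting names the Actor model but not a specific library), the snippet below shows a counter actor that updates its state one message at a time, without explicit locking.

```scala
// Minimal Akka Typed sketch: a counter actor whose state changes only via messages.
import akka.actor.typed.scaladsl.Behaviors
import akka.actor.typed.{ActorRef, ActorSystem, Behavior}

object Counter {
  sealed trait Command
  final case class Increment(by: Int) extends Command
  final case class GetValue(replyTo: ActorRef[Int]) extends Command

  // Messages are processed sequentially, so `value` is never accessed concurrently.
  def apply(value: Int = 0): Behavior[Command] =
    Behaviors.receiveMessage {
      case Increment(by) =>
        Counter(value + by)   // become the next behavior with the updated state
      case GetValue(replyTo) =>
        replyTo ! value       // reply asynchronously to the requester
        Behaviors.same
    }
}

object CounterApp {
  def main(args: Array[String]): Unit = {
    val system: ActorSystem[Counter.Command] = ActorSystem(Counter(), "counter-system")
    system ! Counter.Increment(2)
    system ! Counter.Increment(3)
    // In a real service, GetValue would be combined with the ask pattern and a timeout.
    system.terminate()
  }
}
```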