JobsAisle

Software Developer 2

Number Theory

Gurugram, India · ₹35,000–₹100,000/mo (AED 1.5K–4.4K/mo) · Today
India · Core Java · Scala · Big Data · Spark · Unix · Linux · Full Time

Skills Required

Python · Java · AWS · Azure · ERP

Job Description

As an experienced candidate with 2+ years of design and development experience in Java/Scala, you should possess the following skills and qualifications.

Role Overview:
You will be responsible for developing Spark jobs; analyzing, designing, and testing the complexity of Spark jobs; and working on algorithms, data structures, databases, and distributed systems. Your expertise in Core Java or Scala, Big Data, Spark, and enterprise application design patterns will be crucial for this role.

Key Responsibilities:
- Develop Spark jobs, applying sound OOP knowledge
- Analyze, design, and test the complexity of Spark jobs
- Work with algorithms, data structures, databases, and distributed systems
- Apply working knowledge of Unix/Linux
- Use hands-on Spark experience: creating RDDs and applying operations (transformations and actions)

Qualifications Required:
- Proficiency in Core Java or Scala
- Experience in Big Data and Spark
- Extensive experience developing Spark jobs
- Sound OOP knowledge and awareness of enterprise application design patterns
- Ability to analyze, design, develop, and test the complexity of Spark jobs
- Working knowledge of Unix/Linux

Additional Company Details:
The company also values candidates with skills in Python, Spark Streaming, PySpark, and Azure/AWS Cloud, as well as knowledge of the data storage and compute sides.
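The posting asks for hands-on experience creating RDDs and applying transformations and actions. A minimal Scala sketch of that pattern (assuming a local Spark environment; the object name and data are illustrative, not from the posting):

```scala
// Minimal RDD sketch: transformations are lazy; an action triggers execution.
// Requires the spark-core/spark-sql dependencies; local[*] is for illustration only.
import org.apache.spark.sql.SparkSession

object RddSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("rdd-sketch")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Create an RDD from a local collection
    val nums = sc.parallelize(1 to 10)

    // Transformations (filter, map) build a lineage; nothing runs yet
    val squaresOfEvens = nums.filter(_ % 2 == 0).map(n => n * n)

    // An action (reduce) executes the lineage
    val total = squaresOfEvens.reduce(_ + _)
    println(s"Sum of squares of evens: $total") // 4 + 16 + 36 + 64 + 100 = 220

    spark.stop()
  }
}
```

The lazy transformation/eager action split is what lets Spark plan and distribute the whole pipeline before any data moves.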