Lead Scala/Spark Developer

Overall purpose of role
- Developer, designer and eventual team lead for Composer.
- Improve the robustness and performance of Composer as a distributed data transformation platform.
- Periodically work in the architecture team to evaluate, compare and test technologies.
- Provide assistance to other teams in the form of design consultancy and performance tuning.

Key Accountabilities
- Accountable for delivering key components for a robust, highly scalable data transformation platform.
- Responsible for designing and delivering components.

The key focus for this platform is to drive cost savings by simplifying data manipulation: it aims to drastically reduce the number of people required to productionise transformations. This is a core platform capability, so interaction with architects, business analysts, business users and even traders will be required.

We need a creative, modern problem solver. This is not a preordained, textbook-style system; we need people who think outside the box and integrate systems with the express focus of improving processes with technology. The ability to simplify complex problems is key. In particular:
- Simplify the creation of data workflows
- Simplify the productionisation of data workflows
- Simplify the configuration management of workflows
- Increase the transparency of business logic
- Improve the velocity of change to meet regulatory demand
- Provide richer and more powerful information for business decision making

This role comprises two distinct functions. The first is the hands-on design, development and deployment of an innovative new product. The second covers architectural functions such as design, performance tuning and research. This role is not about applying well-known patterns to established problems: our application applies advanced computer science theory to our business domain, and we are focused on integrating concepts to build elegant and simple solutions.
As such we require an open mind, an entrepreneurial spirit and keen analytical insight, reasoning from fundamentals and concepts rather than from patterns and technologies.

The solution combines concepts from program interpretation, functional programming, orchestration, function and data visualisation, data manipulation, distributed computing, immutable data structures and graph theory. We focus on a declarative style that abstracts business logic from IT implementation details and technology, rather than relying purely on interfaces and messaging.

We are constantly evaluating our processes and design to provide a holistic, integrated, end-to-end solution. This saves costs by eliminating unnecessary interactions and coordination. We always attempt to improve the visibility of business logic, putting as much power as we can in the hands of those who understand the data and the business. The ideal candidate will embrace this philosophy and actively advance these ideas.

Essential Skills/Basic Qualifications:
- At least 7 years of IT programming experience.
- Functional programming experience in Scala; exposure to Clojure, Haskell or F# will be beneficial.
- At least 5 years of hands-on experience with server-side parallel processing applications (Akka, multi-threading, JMS).
- Experience with Hadoop-ecosystem technologies: Spark, HBase, Kafka, Impala, ZooKeeper.
- Experience with, or interest in, in-memory computing: Apache Ignite, Coherence, GridGain, Hazelcast, Alluxio.
- Strong fundamentals in computer science, with an interest in applying theory to real practical problems.
- Experience in data modelling, with good SQL skills in at least one enterprise-grade RDBMS.
- Experience in Test/Behaviour Driven Development (including test automation and mocking tools).
- Understanding of Agile methodologies and SDLC phases.

Desirable skills/Preferred Qualifications:
- A graduate degree with strong grades from a reputable university.
- Continued education, e.g. MOOCs (Coursera, Udacity etc.).
- A general understanding of basic risk management principles and the financial Greeks (not required).
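To illustrate the declarative, interpreter-based style described above, here is a minimal Scala sketch. Every name in it (Step, Select, Filter, Rename, Interpreter) is a hypothetical illustration, not the actual Composer API: each workflow step is plain immutable data (an ADT), and a separate interpreter decides how to execute the pipeline.

```scala
// Hypothetical sketch only -- not the Composer API. Business logic lives in
// an immutable description; the interpreter below runs it over in-memory
// rows, but a different interpreter could target Spark instead.
sealed trait Step
final case class Select(columns: Set[String]) extends Step
final case class Filter(column: String, predicate: String => Boolean) extends Step
final case class Rename(from: String, to: String) extends Step

object Interpreter {
  type Row = Map[String, String]

  private def runStep(rows: List[Row], step: Step): List[Row] = step match {
    case Select(cols)     => rows.map(_.filter { case (k, _) => cols(k) })
    case Filter(col, p)   => rows.filter(r => r.get(col).exists(p))
    case Rename(from, to) => rows.map(r => r.get(from).fold(r)(v => (r - from) + (to -> v)))
  }

  // Because the workflow is just data, it can be inspected, versioned and
  // re-targeted without touching the business logic it encodes.
  def run(workflow: List[Step], rows: List[Row]): List[Row] =
    workflow.foldLeft(rows)(runStep)
}
```

Separating the description of a transformation from its execution is what lets business analysts read and change the logic while the platform handles productionisation.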