  • Contract Job Opportunity: Analyst/Developer at Scotiabank
    Duration: 10/09/2017 to 03/31/2018
    Location: Toronto
    Group
    • GWWRT - Global Capital Markets

    Big Data Developer - Python

    Purpose of Job:

    The Global Wholesale, Wealth, and Risk Technology – Analytics team is responsible for servicing Global Capital Markets worldwide and delivering data storage and analytics solutions to all business lines: Equities, FX, FI, Derivatives, Commodities, etc.

    An End User Computing prototype Big Data solution has been developed for reporting Client Metrics and Sales. This now needs to be productionized, made industrial-strength, enhanced, and maintained. The incumbent is responsible for the hands-on development and testing of a Hadoop-based application set that aggregates and reports on data coming from all of the trading lines of business, merges it with the appropriate standing and reference data, and makes reports available to users on a self-service basis. The role carries responsibility for the detailed design and specification of the solution, with an emphasis on pragmatic delivery.
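    To make the aggregation pattern described above concrete, here is a toy, self-contained sketch: join per-business-line trade records with client reference data, then roll the result up per client and line. All table and column names are invented for illustration, and SQLite stands in purely so the example runs anywhere; the posting's actual stack is Hadoop/Spark.

    ```python
    # Toy illustration of the aggregation described in the posting: merge trade
    # data from multiple business lines with client reference data and summarize.
    # Table/column names are hypothetical; SQLite is used only to keep this runnable.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE trades  (client_id INT, business_line TEXT, notional REAL);
    CREATE TABLE clients (client_id INT, client_name TEXT);
    INSERT INTO trades  VALUES (1, 'FX', 100.0), (1, 'Equities', 50.0), (2, 'FX', 75.0);
    INSERT INTO clients VALUES (1, 'Acme Corp'), (2, 'Globex');
    """)

    # Join trades to reference data, then aggregate per client and business line.
    rows = conn.execute("""
    SELECT c.client_name, t.business_line,
           SUM(t.notional) AS total_notional, COUNT(*) AS trade_count
    FROM trades t LEFT JOIN clients c USING (client_id)
    GROUP BY c.client_name, t.business_line
    ORDER BY c.client_name, t.business_line
    """).fetchall()
    for r in rows:
        print(r)
    ```

    In production the same join/group-by shape would be expressed in Spark SQL over HDFS-resident Parquet files rather than an in-memory database.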

    The incumbent will be expected to plot the most expedient route from the current implementation to a robust, maintainable, and well-documented system set, paying attention to both the eventual quality and the cost of delivery.

    Experience in capital markets and a good knowledge of financial instruments, ranging from derivative products (IR, Credit, and Equity) to foreign exchange, is a bonus.

    We are looking for a creative, self-starting, results-oriented, and highly motivated individual with attention to detail and excellent problem-solving skills. The incumbent must be able to multi-task and work in a fast-paced, agile environment.

    The incumbent must also possess a very good understanding of current IT practices, systems design and development techniques, including testing methodologies, and keep current with rapidly changing technology.

    Minimum of 7-10 years of industry experience.

    Key Accountabilities:

    1. Develop and enhance a Hadoop-based data aggregation and reporting application, taking it from prototype to production deployment.
    2. Rationalize the technology stack being used for the application and harmonize it with the broader Global Capital Markets Technology Strategy.
    3. Develop and then execute a plan to put a robust set of non-functional controls, processes and procedures around the application to ensure maximum uptime and minimize mean time to recovery.
    4. Assist with integration of the application into the client's state-of-the-art DevOps infrastructure for automated build and automated deployment.
    5. Develop feeds into the system using a variety of technologies, from custom code through conventional ETL tools to open-source tools such as Apache NiFi.
    6. Document the system to a level at which it can be handed over to a third-line support team.
    7. Prepare specifications and documentation for all software developed; use standard project management and team collaboration tools.
    8. Collaborate as part of an Agile development team: participate in daily standups, prepare work estimates, and identify blocking and critical-path steps.
    9. Code hands-on in one or more of the programming languages identified in the Functional Competencies listed below.
    10. Take accountability for delivery within pre-agreed time and budgetary constraints.
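    The feed-development accountability above can be sketched minimally: read a delimited extract, validate each record, and route failures to a reject pile so the feed stays auditable. Field names and validation rules here are hypothetical, not from the posting.

    ```python
    # Minimal, hypothetical sketch of a coded feed: parse a delimited extract,
    # validate each record, and keep rejects separate for audit/replay.
    import csv
    import io

    RAW_FEED = """trade_id,client_id,notional
    T1,1,100.0
    T2,,50.0
    T3,2,not_a_number
    """.replace("    ", "")  # strip indentation from this inline sample

    def load_feed(text):
        good, rejects = [], []
        for row in csv.DictReader(io.StringIO(text)):
            try:
                if not row["client_id"]:
                    raise ValueError("missing client_id")
                row["notional"] = float(row["notional"])  # rejects non-numeric values
                good.append(row)
            except ValueError as exc:
                rejects.append((row["trade_id"], str(exc)))
        return good, rejects

    good, rejects = load_feed(RAW_FEED)
    print(len(good), len(rejects))  # 1 valid record, 2 rejects
    ```

    A conventional ETL tool or NiFi flow implements the same split (validated route vs. failure route) declaratively; the point of the sketch is the shape of the control, not the tooling.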

    Functional Competencies:

    The incumbent must have the following:
    1. Expert hands-on programming skills in at least two of: Spark, HDFS, Apache Parquet, Django, Pentaho, Tableau, Apache Airflow, Scala
    2. Expert hands-on skills in all of: Python, Bash, SQL (PostgreSQL preferred)
    3. 5+ years of data integration and reporting experience
    4. Experience working in an agile team

    It would be nice if the incumbent had at least one of:
    5. A very good knowledge of financial instruments and how they are represented in trading and risk systems.
    6. Experience working in banking/capital markets
    7. Understanding of DevOps integration

    Educational Requirements:
    1. A recognized undergraduate degree in business, mathematics, computer science, or a related discipline.
    2. A recognized post-graduate or master's degree in business, computer science, finance, or mathematics; a CFA and/or FRM designation would be an asset.
    3. Or an awesome GitHub profile and lots of coding experience.


    Degrees or certifications:
    • Bachelor's degree or Post-Secondary education

    Additional notes: This resource will be part of a new team being built. The resource will work on client valuation and analytics: examining the revenue stream, segmenting out the most and least valuable clients, producing matrix reporting from the profitability data set, etc. They will communicate with traders, middle office, and IT teams.
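    The client-valuation work described in the notes can be sketched as a simple ranking-and-tiering pass: sort clients by revenue and bucket them by cumulative contribution. The data and the 80% cutoff below are invented for illustration; a real profitability model would be far richer.

    ```python
    # Hypothetical sketch of client-valuation segmentation: rank clients by
    # revenue, then tag those covering the first 80% of cumulative revenue as
    # "most valuable". Figures and the cutoff are illustrative only.
    revenue_by_client = {"Acme": 1200.0, "Globex": 300.0, "Initech": 50.0, "Umbrella": 900.0}

    ranked = sorted(revenue_by_client.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(revenue_by_client.values())

    tiers = {}
    cumulative = 0.0
    for name, rev in ranked:
        cumulative += rev
        # Pareto-style cut: clients within the first 80% of revenue are top tier.
        tiers[name] = "most valuable" if cumulative <= 0.8 * total else "least valuable"
    print(tiers)
    ```

    The same segmentation, run over the profitability data set the posting mentions, would feed the matrix reports delivered to traders and middle office.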