Find the Best Jobs in Technology & Data

Advance your career! Find real jobs posted by real employers on our award-winning platform.

Job Seekers

Post your resume and get noticed
by industry-leading companies.
Add Resume

Employers

Advertise your job to reach the
best talent in Technology and Data.
Post a Job

Latest Insights

A shrinking pool of qualified candidates has surfaced as a top business risk for global executives in risk, audit, finance, and compliance, according to a recent survey by Gartner, Inc. At a time of historically low unemployment, when the supply of available workers is much smaller, organizations are struggling to find and retain the talent they need to meet their strategic objectives. Ranked No. 3, behind accelerating privacy regulation and cloud computing, this is the first time a talent shortage has been named a top business risk in Gartner's quarterly Emerging Risks Report. Cloud computing, which was ranked the No. 1 risk in 2Q18, remains a concern. Cybersecurity disclosure and the artificial intelligence (AI)/robotics skills gap round out the top five concerns among the executives surveyed.

"In this strong economic environment of significant business growth and record-low unemployment levels, the battle for talent is heating up as employees now have more bargaining power," said Matthew Shinkman, practice leader at Gartner. "As a result, talent is harder to find and even more difficult to keep."

In the U.S. alone, the number of unfilled jobs rose by 117,000 to 6.94 million from June to July 2018, based on the most recent Job Openings and Labor Turnover Survey. And in the U.K., the unemployment rate is now at its lowest level in four decades, according to the Office for National Statistics.

As business leaders feel the squeeze, the pressure on recruiters continues to intensify. In a Gartner survey of 400 executives on their level of satisfaction with their organization's ability to attract and retain high-performing talent in the current environment, only 26 percent reported being very satisfied or satisfied. Digital transformation initiatives have only increased this pressure by creating immense competition for workers who are skilled at navigating an increasingly digital environment.

Moving From Needs-Driven to Market-Driven Sourcing

To mitigate the risk of talent shortage as competition for workers continues to rise, leading organizations are shifting how they source talent. Most recruiting professionals take a needs-driven approach, setting the sourcing strategy to fulfill the defined needs of the organization. Instead, Gartner recommends a market-driven approach that ensures the sourcing strategy adapts to evolving external labor market realities as well as organizational needs. This approach includes the following hallmarks:

1. Confront brand weaknesses. Recognize (mis)perceptions that limit access to talent pools, and actively address them.
2. Coach prospects' career decisions. Understand prospective candidates' decision-making process, and act as a career coach.
3. Expand the labor market opportunity. Optimize the search criteria to redefine and expand the available talent pools.
4. Cultivate critical talent supply. Reorient learning and development to close skill gaps created by digital transformation.

"In today's tight labor market, where employees have the upper hand, workers are more willing to look for a job with better pay, more generous benefits, or defined career development opportunities, or all three," said Brian Kropp, group vice president of Gartner's HR practice. "To retain your best employees, companies need to better understand what matters most to them and help them see how they can advance in their careers with their current company, especially if wage growth remains stagnant."

Article published by Anna Hill. Image credit: Getty Images, Westend61.
Want more? For Job Seekers | For Employers | For Influencers

In banks and financial institutions, a typical barrier exists between Finance and Risk. The Finance world is very "exact": a balance sheet or a Profit and Loss (P&L) statement has to be precise. In the Risk Management world, things are slightly different: numbers are commonly produced by statistical analysis or stochastic simulations, and the recipes used to create them vary. It is indeed fascinating.

Balance sheet forecasting is a technique Finance professionals have used for a long time, although most of the time the forecasting models are fairly simple. Applying a more analytic approach to the balance sheet and other Finance artefacts, such as the P&L or income statement, could provide us with some leverage.

Assets, liabilities and equity can be broken down into small categories, ending up with dozens of items (or factors, for the purpose of our discussion). Now, let's imagine we have access to the general ledger (GL) system of a bank (or any business) and can extract a time series for each of those balance sheet factors. We could simulate future balances for each factor and add them up as a portfolio, which can be seen as a projected upcoming balance sheet. If we repeat this process thousands of times over multiple time steps, we can build a distribution and apply a confidence level (e.g. 99%), so that a worst-case scenario can be forecasted for the balance sheet. Since we have the balance sheet time series from the GL system, we can also calculate the volatility of each factor and compute the correlations amongst them. A similar approach could be applied to income statement or P&L data as well.

By treating the company's finances with risk analytics techniques, potential benefits could include:

- Obtaining a quick and dirty forecast of capital (i.e. equity) and losses
- Devising a direction for your business
- Observing potential cash-flow (i.e. liquidity) gaps
- Analysing worst-case scenarios for P&L
- Screening potential M&A opportunities (i.e. valuation)
- Having a sneak preview of future tax impact
- Identifying areas to address across the different factors to avoid future problems

In a bank, the data from the GL would most likely reflect balances on different types of retail products (e.g. home loans, credit cards, deposits, personal loans). The Monte Carlo simulation discussed above could be applied to each of those product types to forecast its potential state. This becomes very powerful for decision making and for learning the possible growth of the different product types. Mix-optimisation exercises could also be performed if this feature were available.

On the not-so-uplifting side, a potential challenge is preserving the inter-relationships across the balance sheet factors. In this regard, we could build functions into the model so that a segregation of dependent and independent factors becomes available. We also need to be mindful that reconciliation of forecasts might be required at different levels of the GL hierarchy.

As can be seen, there are plenty of opportunities that could be targeted with this approach. In the same fashion that we have explored benefits from a Finance perspective, a Marketing lens could also be applied, opening up further opportunities.
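To make this concrete, here is a minimal sketch of the simulation in Python with NumPy. Everything in it is an illustrative assumption: the factor names, the balance scales, the monthly step and 12-month horizon, and the randomly generated "history" that stands in for a real GL extract. A production model would calibrate drift, volatility, and correlations from actual ledger data and treat liability sign conventions more carefully.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical monthly balance history per factor (rows = months), standing
# in for a time series extracted from the GL system. Values are illustrative;
# deposits carry a negative sign as a liability.
factors = ["home_loans", "credit_cards", "deposits", "personal_loans"]
levels = np.exp(rng.normal(0.005, 0.02, size=(60, len(factors))).cumsum(axis=0))
history = levels * np.array([500.0, 120.0, -800.0, 90.0])

# Calibrate from history: drift and volatility of monthly log-changes per
# factor, plus the covariance (hence correlation) structure across factors.
returns = np.diff(np.log(np.abs(history)), axis=0)
mu = returns.mean(axis=0)
cov = np.cov(returns, rowvar=False)

# Monte Carlo: draw correlated log-return paths, roll each factor's balance
# forward from the latest observation, and sum across factors to obtain a
# projected balance sheet total per simulation and time step.
n_sims, horizon = 10_000, 12
shocks = rng.multivariate_normal(mu, cov, size=(n_sims, horizon))
paths = np.abs(history[-1]) * np.exp(shocks.cumsum(axis=1))
paths *= np.sign(history[-1])   # restore asset/liability signs
totals = paths.sum(axis=2)      # (n_sims, horizon) portfolio view

# Worst case at a 99% confidence level on the horizon-end total.
worst_case = np.percentile(totals[:, -1], 1)
print(f"99% worst-case projected total at month {horizon}: {worst_case:,.1f}")
```

Because every factor path is drawn from one covariance matrix, the simulated balance sheets preserve at least the linear inter-relationships between factors; the segregation of dependent and independent factors mentioned above would go further, deriving some factors as functions of others.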

Article written by Jaime Noda.
Want more? For Job Seekers | For Employers | For Influencers

The approach of making government decisions based on collected information is not a new idea; it dates back to the age before data was stored electronically. Traditionally, data has been isolated in organizational silos: government organizations collected and managed data only to serve their own business interests and support internal reporting needs. As a result of this culture, extracting intelligence from data across different government organizations is difficult. As government decisions become increasingly complex, government organizations are trying to find new data-driven solutions to help government make smarter decisions, improve productivity, and deliver greater citizen services.

Being data-driven is more than just sharing or centralizing data; it means creating a culture of data-driven thinking. The following five practices, if accomplished, will help government organizations move toward this culture.

1. Align data activities with business processes

The starting point of any data-driven approach is to understand the organization's business processes, including organizational visions, business practices, information flows, and data management systems. It is also important to understand the regulations and policies affecting data activities, including regulations for data collection, standards for data management, and policies for data sharing. With all this data in place, the linkage between organizational functions and the available data elements needs to be established. In addition, data gaps, and potential solutions to fill them, need to be identified and documented.

2. Connect the data nodes

A huge volume of data is collected every day by governments, yet many of these data elements have never been shared across jurisdictional boundaries. Data is widely dispersed across different domains of government information systems, and connecting all these data "nodes" can maximize its value. Many government programs have started establishing cross-jurisdictional data working groups that aim to promote collaboration in data activities and problem solving. Integrating and coordinating different data elements is difficult to manage, and there are many fears about the associated risks, but these technical and communication barriers are manageable. A successful case is the U.S. government's open data website, data.gov.

3. Turn data into decisions

Data is useful only if the embedded intelligence is extracted. Governments can typically progress through three steps toward data-driven decisions. The first step is to understand the business goal and choose the right metrics. The designed metrics should be quantitative and measurable, and they should not require a lot of time to explain. Successful metrics will not only inform the current decision; they will continue to serve as a benchmark to track and assess the status of a specific business process in the future. The second step is to do the research and find the right data. In this era we hear about Big Data all the time, but people rarely talk about the right data. If you have the right data, it does not have to be "Big" in volume; a small amount of the right data will work well for most problems. The third step is to bring data to life. When creating data visualizations, focus on trends and correlations instead of simply presenting results, as these are usually the two key factors that influence decisions. In addition, use simple visualization designs and plain language to deliver the message clearly, and listen to feedback from decision makers to improve the approach.

4. Visualize and analyze data with mapping technologies

Mapping is a big subset of data visualization. It allows us to combine multiple layers of geographic information or metrics into a single visualization, which makes it a powerful way to quickly draw the audience's attention and create multiple perspectives on the data. Interactive mapping applications are emerging as tools for data visualization and information sharing. With them, data can be visualized at any level of granularity: sensitive data can be aggregated into boundary-level summaries to enable information sharing while protecting personal and private information. A good mapping product can show how different factors affect outcomes, and decision makers can use the story told by the maps to plan better government services. For example, overlaying transit facilities on a population/employment density map can help transit agencies understand where transit service is in demand, how close riders are to services, and which areas are underserved; a sketch of this kind of boundary-level aggregation follows.
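The Python sketch below is only a hedged illustration of that aggregation step, using the geopandas library (assumed installed, with matplotlib for plotting). The file names, the tract_id column, and the boardings dataset are all invented for the example.

```python
import geopandas as gpd

# Point-level records (e.g. individual transit boardings): sensitive detail
# that should not be shared as-is. File and column names are hypothetical.
boardings = gpd.read_file("boardings.geojson")
# Boundary polygons (e.g. census tracts) carrying population density.
tracts = gpd.read_file("tracts.geojson")

# Put both layers in the same coordinate reference system before joining.
boardings = boardings.to_crs(tracts.crs)

# Spatial join: tag each boarding with the tract polygon that contains it.
joined = gpd.sjoin(boardings, tracts, how="inner", predicate="within")

# Aggregate to tract level: only counts leave this step, not raw records.
summary = joined.groupby("tract_id").size().rename("boarding_count").reset_index()
tract_map = tracts.merge(summary, on="tract_id", how="left")

# Choropleth of demand, ready to overlay with transit facility layers.
tract_map.plot(column="boarding_count", legend=True,
               missing_kwds={"color": "lightgrey"})
```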
5. Share government data as Open Data

"Open Data is defined as structured data that is machine-readable, freely shared, used, and built on without restrictions." (Government of Canada Open Data)

Government organizations collect, produce, and purchase a broad range of data to support government programs. For many years, a wealth of these high-value datasets, whether country-wide census data, road congestion monitoring data, or satellite/aerial imagery, remained largely unexploited because of confidentiality, privacy, or proprietary restrictions. To create new value from data, many government organizations in North America have, over the last several years, initiated programs to release government data as open datasets. In 2009, the U.S. government launched data.gov; today, a total of 323,755 datasets are published there. The Government of Canada launched its first-generation Open Data Portal, data.gc.ca, in March 2011, and 69 government organizations in Canada have now launched open data programs. Open Data commitments will continue to grow in the years to come because they create new value and new business opportunities for both government and the community.
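As a small taste of what "machine-readable" means in practice, the sketch below queries the data.gov catalog, which runs on the CKAN platform, so the dataset index can itself be searched programmatically. It assumes the standard CKAN package_search endpoint and the Python requests library; the search term is just an example.

```python
import requests

# Standard CKAN search endpoint behind the data.gov catalog.
API = "https://catalog.data.gov/api/3/action/package_search"

resp = requests.get(API, params={"q": "transit ridership", "rows": 5}, timeout=30)
resp.raise_for_status()
result = resp.json()["result"]

print(f"{result['count']} matching datasets; showing {len(result['results'])}:")
for dataset in result["results"]:
    # Each CKAN package lists its downloadable resources (CSV, JSON, ...).
    formats = {r.get("format", "?") for r in dataset.get("resources", [])}
    print(f"- {dataset['title']} [{', '.join(sorted(formats))}]")
```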
In summary, the technologies behind data science are evolving rapidly and have made data-driven government possible. It is no longer a matter of whether it can be done, but of when and how. It is clear that using data in decision making can contribute enormously to the efficiency and accountability of public services. Moving toward data-driven government is a strong, long-term commitment, but these five practices can be embraced by government organizations in the shorter term to start the journey.

Article written by Jason Li.
Want more? For Job Seekers | For Employers | For Influencers

View All Insights