
  • Report Summarising Key Insights and Recommendations from Data Governance Act Webinar Now Available

    Last February, DUET joined forces with three other H2020 projects (Policy Cloud, Cyberwatching, URBANITE) to discuss the proposed regulation on data governance. The webinar attracted a lot of interest from across Europe; in total, 121 participants joined the event to hear from industry experts about the Data Governance Act and its implications for policy making. The published report provides an overview of the webinar, as well as relevant links to the recording and slides. It includes a comprehensive summary of the audience questions and panellist responses from the Q&A session. Seven practical recommendations on how to prepare for the upcoming DGA are provided at the end, covering internal policy, cybersecurity, the data ecosystem, user-centric approaches, data governance tools, technology, and digital twins. >> Download the report here <<

  • Digital Twins for Climate-Neutral Smart Cities

    On 22 April 2022, DUET partners moderated a breakout session organised as part of the DigitALL Public conference. The aim was to exchange ideas and stimulate a discussion on how digital twins can support cities in their green transition. Below we share some thoughts on the subject, including a round-up of the main takeaways from the side event.

Climate change is a global concern. We urgently need new thinking and strategies to address it and achieve sustainable growth. The European Green Deal is the EU’s response to the challenge, with the overall goal of transforming the Union into a modern, resource-efficient and competitive economy by 2050. Digital innovation will play a key role in this transition. By harnessing new technologies and data, cities can start realising their long-term vision (e.g. net zero) while effectively tackling more immediate problems such as waste, traffic and air pollution.

The European Commission has set ambitious targets for climate neutrality: 100 European cities by 2030, and all European cities by 2050. While challenging, these targets are not unattainable. Many European initiatives, among them DUET and its precursor PoliVisu, have supported cities on this journey, providing a platform for the co-creation of data-driven solutions that meet local needs and priorities. PoliVisu was a three-year project that used data visualisations to close the gap between long-term policies (e.g. the European Green Deal) and short-term operational decisions (e.g. road closures). The following examples illustrate how this was achieved in practice.

Mechelen, Belgium: Created ‘school streets’ by having citizens put cheap sensors in their windows to collect data on the number of vehicles passing their houses. The data fed a community dashboard used to work out which streets around schools should be closed to traffic during the school run, making it safer for children to walk to and from school.
The Mechelen 'schoolstraat' dashboard

Pilsen, Czechia: Used visualisations of its road network to calculate and monitor the impact of the city’s Sustainable Urban Mobility Plan. Near real-time views made it possible for authorities to intervene preemptively to avoid major congestion. This helped keep traffic flowing, ultimately reducing the amount of noxious gases released into the atmosphere.

Pilsen's Traffic Modeller solution

DUET takes data visualisation to the next level. By using digital twin technology, the project combines multiple models and data streams in a common environment, allowing users to perform various what-if analyses related to traffic, air quality and noise pollution. For instance, users can simulate the effects of new developments to see how building a new apartment complex can alter mobility demand and generate new traffic to typical destinations, e.g. work zones, universities, residential areas. With regards to air quality, calculations can be performed for several pollutants (PM10, PM2.5, NO2) based on weather information (wind direction, wind speed) and spatial conditions, as well as traffic flows. And when it comes to noise modelling, the results can be displayed as maps in a digital twin, which local authorities can use to assess citizens’ exposure to noise-generating activities.

The DigitALL Public conference presented an excellent opportunity for us to share all these ideas with other cities (e.g. Bologna, Fredrikstad), and also to see how they think digital twins can help them in their green transition. Below we provide a round-up of the main takeaways from the breakout session:

  • Digital twins are seen as a massive investment. Those leading the way need to show others how to embark on this journey in a cost-effective manner. A few validated case studies, ideally aligned with climate goals, are a great way to get started.
  • The quest for perfect data in support of decision making still puts off many cities. It is an ideal that may never be achieved. Nevertheless, it is worth investing in data literacy to improve our understanding of what data can and can’t deliver, of its benefits and drawbacks.
  • Digital leadership and championing is crucial at all levels of government to drive digital twin adoption at scale. Cities that have succeeded highlight the need to target politicians, city managers and influential business groups for multi-stakeholder buy-in.

12 cities spanning six countries are currently working on urban digital twins as part of the LIVING-IN.EU initiative. Want to join? Click here.

  • Building Secure and Trusted Digital Urban Twins

    Digital urban twins must handle large amounts of information transmitted to them from the physical world. The challenge is to ensure that this information is secure and trusted throughout the whole process. Legacy systems, insecure communications, and unpatched vulnerabilities in software and hardware all jeopardise information security. Add to this mix the ever-increasing number of low-cost sensor devices and you get a pretty big attack surface susceptible to a wide range of attacks. Threats relevant to IoT infrastructures include:

  • Information interception by means of network reconnaissance, session or protocol hijacking
  • Network outages leading to a loss of support services, failures of devices or an entire system
  • Abuse in the form of malware, DDoS attacks, identity theft, privacy violations
  • Damage and malfunctions caused by exploits, information leaks, data disclosures, third-party failures
  • Unintentional damage caused by erroneous use of devices and systems, or modification of source code and data
  • Legal consequences resulting from the violation of rules and regulations, as well as breach of contract obligations

Figure 1. DUET threat taxonomy

Many of these threats are relevant to DUET. To protect our ecosystem, we’re going to implement specific measures, or Technical Controls (TCs), in six key areas: run-time authentication; run-time authorisation; run-time monitoring and auditing; communications; data protection and compliance; and the software development lifecycle.

Run-time authentication is required to connect to a DUET backend service, visualisation system or sensor. There are many ways to ensure secure authentication, for example through standardised and effective cryptography and security protocols such as TLS, which helps protect the confidentiality, authenticity and/or integrity of data and information (including control messages) in transit.
Measures such as rate limiting can be applied to control requests to backend services, which minimises the risk of automated attacks, while methods such as two-factor authentication should be enabled by default for critical DUET subsystems and actions.

For run-time authorisation it’s important to implement access control whereby the system verifies that users and applications have the right permissions. Security roles and privileges should be established for both systems and users, and fine-grained authorisation mechanisms should be in place to limit the actions allowed. Furthermore, applications and users shall follow the principle of least privilege and operate at the lowest privilege level possible.

The implementation of run-time monitoring and auditing requires regular checks to verify device behaviour, detect malware and discover integrity errors. For example, anomaly-based methods compare observed network traffic with normal traffic, and attacks such as DoS are detected when irregular activities are spotted. In addition, a logging system is needed to record events relating to user authentication, management of accounts and access rights, modifications of security rules, and the functioning of the system.

From using modern cryptographic hash algorithms, to implementing a DDoS-resistant and load-balancing infrastructure, to accepting devices and APIs only via secure protocols (HTTPS), there are many ways to ensure that communications are secure and trusted. We also want to stress that all errors should be handled correctly, that all input/output data should be validated before it is accepted, and that queries should use parameterisation (or an equivalent security measure) to avoid injection and related attacks, e.g. SQL injection, XSS, CSRF.
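To make the parameterisation point concrete, here is a minimal sketch (our own illustration using Python's standard sqlite3 module, not DUET's actual code): the placeholder keeps user input out of the SQL text, so a classic injection payload is treated as a harmless literal value.

```python
import sqlite3

# Minimal in-memory example of a parameterised query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sensors (id TEXT, owner TEXT)")
conn.execute("INSERT INTO sensors VALUES ('s1', 'mechelen')")

user_input = "s1' OR '1'='1"  # a classic injection payload

# With string concatenation this payload would match every row.
# With the '?' placeholder it is compared as a literal string.
rows = conn.execute(
    "SELECT id FROM sensors WHERE id = ?", (user_input,)
).fetchall()
print(rows)  # → [] — the payload matches nothing
```

The same principle applies regardless of database engine: never build query text by concatenating untrusted input.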
Under data protection and compliance we want to reiterate principles that most people working in the GDPR environment are probably familiar with: personal data must be collected and processed fairly, lawfully and in a transparent manner; it should never be collected and processed without the data subject’s consent; personal data should be used only for those purposes for which it was originally collected (the purpose limitation principle); and any further processing must be compatible with those purposes, with the data subjects well informed. To make our digital twins compliant with the GDPR, we will link each data stream within the system to its owner. Such dynamic consent management will enable citizens to give or revoke consent for any service that uses their data.

Finally, security measures for the software development lifecycle will vary from stage to stage. For example, under planning we foresee mechanisms for self-diagnosis and healing to expedite recovery from failure, malfunction or a compromised state; under authentication and authorisation, a separation of duties to enable collusion-resistant processes that minimise risk exposure; under development, libraries and third-party components that are patched for the latest known vulnerabilities; under monitoring and auditing, protections against privilege abuse and software logs registering all relevant security events. All TCs for this and other areas are shown in the diagram below.

Figure 2. Taxonomy of DUET security measures

The foregoing security measures will cover every underlying asset in the DUET ecosystem, including sensor devices used to collect information on traffic, weather and noise pollution; external systems and network elements (routers, gateways, virtual machines); middleware; computing infrastructure; and information in different states (at rest, in transit, in use), as well as metadata.
The ultimate goal is to provide a digital twin solution that can be trusted by end-users who are becoming increasingly conscious of both cybersecurity risks and their rights as data subjects. If you would like further details on DUET’s security architecture, feel free to drop us a line or check this report.
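The dynamic consent management described above can be sketched as a registry that links each data stream to its owner, with default-deny processing. All names and structure below are our own hypothetical illustration, not DUET's actual implementation:

```python
# Hypothetical sketch of dynamic consent management: each data stream is
# linked to its owner, who can grant or revoke consent per service.
class ConsentRegistry:
    def __init__(self):
        self._owners = {}  # stream_id -> owner
        self._grants = {}  # (stream_id, service) -> bool

    def register_stream(self, stream_id, owner):
        self._owners[stream_id] = owner

    def grant(self, owner, stream_id, service):
        if self._owners.get(stream_id) != owner:
            raise PermissionError("only the data owner can grant consent")
        self._grants[(stream_id, service)] = True

    def revoke(self, owner, stream_id, service):
        if self._owners.get(stream_id) != owner:
            raise PermissionError("only the data owner can revoke consent")
        self._grants[(stream_id, service)] = False

    def may_process(self, stream_id, service):
        # Default deny: no processing without explicit consent.
        return self._grants.get((stream_id, service), False)

registry = ConsentRegistry()
registry.register_stream("traffic-cam-42", "alice")
registry.grant("alice", "traffic-cam-42", "air-quality-model")
print(registry.may_process("traffic-cam-42", "air-quality-model"))  # True
registry.revoke("alice", "traffic-cam-42", "air-quality-model")
print(registry.may_process("traffic-cam-42", "air-quality-model"))  # False
```

The design choice worth noting is the default-deny lookup: a service that was never granted consent, or whose consent was revoked, is refused without any special-case handling.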

  • Data Governance Act: Practical Implications for Digital Twins

    On 25 November 2020, the European Commission published a proposal for a regulation on European data governance, otherwise known as the Data Governance Act (DGA). The legislation aims to make more data available for reuse by increasing trust in data intermediaries and strengthening data-sharing mechanisms across the EU. To discuss the DGA’s implications for decision making, DUET joined three other European projects (Policy Cloud, Cyberwatching, URBANITE) for a webinar that took place on 16 February 2021. (DUET’s presentation starts at 33:00.) In DUET, we certainly see the DGA as an opportunity. Digital twins are an approximation of reality, a simplification that can be improved with more data, provided that this data is of sufficient quality and there is enough computing power to process it. The DGA promises to make more data available from different owners (public sector, private sector, citizens) and sectors, e.g. mobility, environment, health. Moreover, new data is expected to be fully secure, trusted and interoperable, which bodes well for emerging digital twin use cases (e.g. digital twins of citizens that use biometric data) that hold great potential but have not yet hit the mainstream precisely because of the trust issue. We believe that the relationship can work both ways, meaning that, as well as benefiting from the DGA, digital twins can help improve public acceptance of the legislation. As news about digital twins and their benefits spreads through international case studies, stakeholders may be more willing to embrace the data-sharing ethos that the DGA tries to promote. To prepare for the DGA, public administrations would be well advised to review available use cases, including those provided by DUET, and answer some simple questions, starting with: is a digital twin a relevant solution for our city and citizens? If the answer is yes, the next step would be to determine what type of digital twin is needed.
Some cities may see a need for a smart city twin that provides a digital replica of infrastructure objects. Some may want to see energy, traffic and pollution all simulated in a single environment. Others may already have a large-scale IoT testbed, so for them a health-oriented digital twin could be a priority. Once this is settled, stakeholders will have a better understanding of what information should be shared through the European data spaces.

  • Computer Says "Yes!" - Bruges is Building Its Digital Twin

    DUET's project partner imec is going to build a digital twin solution for the city of Bruges. The local authority is keen to leverage this advanced technology to improve operational planning and long-term policy making in areas such as traffic and air quality. What's especially exciting for us is that lessons learned from the DUET project will inform the development of Bruges' digital twin, which is scheduled to start later this year. Read an article (in Dutch) about Bruges' digital twin here.

  • Sharing Our Knowledge at CITYxCITY

    Cities are struggling to unlock value from their data due to a myriad of factors, including a lack of data quality, consistency, accuracy, coverage, freshness and completeness, plus a lack of data understanding (data literacy) to enable meaningful interpretation of data. Aside from the data itself, an over-reliance on traditional methods, combined with the lack of infrastructure with advanced analytics capabilities to analyse the volume and variety of city data fast enough, has also hampered progress. DUET helps cities address these challenges by providing access to the needed computing power, by making data easier to understand, and by establishing ethical principles for data-driven decision making. Cloud computing has not been used for high performance computing to the same degree as for other use cases, chiefly because of cost. DUET sets out to advance this area by providing a new shared approach for its use in policy making and city management using digital twin technology. The Digital Twin infrastructure, with its deep-dive visualisation platform for policy experimentation, will boost collaboration and policy innovation and bring new discoveries and intelligence through novel views of city data. These were the key messages that DUET’s coordinator Lieven Raes of Informatie Vlaanderen presented at the CITYxCITY conference. The event is an initiative of Open & Agile Smart Cities; its goal is to guide cities and communities through their digital transformation journey. CITYxCITY brought together many specialists from across Europe and beyond. We are excited that our presentation attracted the interest of a computer science expert from the publishing industry, who even suggested writing a book about the project, an invitation that we gladly accepted! We will keep you posted on the outcome of this exciting new opportunity. In the meantime, check out Lieven's presentation from the event (starts at 26:29).

  • No Virus Can Stop Us from Promoting DUET!

    One of the things that the Covid-19 pandemic couldn’t stop was online events. Physical events moved into the online sphere, and the GEOBIM 2020 conference was one of them. GEOBIM 2020 initiated a dialogue on the preparedness of the architecture, engineering and construction industry to use integrated geospatial and BIM technologies, along with other digital (or 4IR) technologies and collaborative workflows. On 3 December, DUET was presented by its coordinator, Lieven Raes from Informatie Vlaanderen, Belgium, at the session Innovating the Built Environment with Digital Twins. Lieven discussed the digital disruption of urban decision making and shared what it means to build a digital twin for better policy making.

  • DUET Joins International Collaboration to Improve Digital Twin Standards

    Attempts to define standards for constructing a digital twin have been made previously. However, there is still a substantial gap in both literature and practice as regards unified models and data-fusion interactions for physical-virtual data exchange. This is especially the case in new domains (e.g. smart cities) where the digital twin paradigm has only recently started to take hold. In a bid to close this gap, DUET started working with international initiatives that pilot digital twin technologies to improve urban policy making. Specifically, the cooperation involves six cities from the CRUNCH project (Eindhoven, Southend-On-Sea, Taipei, Uppsala, Gdansk, Miami) and KnowKnow, a UK-based digital innovation SME. The first group call took place on 9 December 2020 and was mostly an introductory meeting during which participants had a chance to present their projects, discuss existing frameworks (e.g. CDBB’s Gemini Principles) and define an action plan for future collaboration. For DUET, this was an opportunity to share its vision for urban digital twins and explain the role of a data broker in its realisation. Acting as a container for models, data and simulations, the data broker framework facilitates the flow of information from diverse static, historic, open and real-time data sources, and translates it into easily digestible output and insights for smart city decision-makers. DUET’s T-cell architecture acts as a central broker onto which different data sources, models, visualisations, interaction clients and other components connect.

In order for DUET to offer useful insights to city planners and other users, it must handle non-interoperable data and map it to a uniform format or ontology. The system is configured to normalise all incoming data onto a common data model. This data model is aligned with common standards and can be extended by the administrator to support more data formats.
Mapping data to a unified model makes it easier to assess the compatibility of datasets and to integrate them into a working solution. The main idea behind DUET’s framework is to make it easy for any city, no matter its size, to benefit from the opportunities provided by the digital transformation. As cities begin to realise the full potential of urban data and the latest technologies, local stakeholders can engage in a meaningful discussion to explore and co-create effective policy interventions in the key Horizon 2020 target areas of transportation, environment and health. In this respect, digital twins act as a conversation starter, an enabler of two-way policy dialogue that brings citizens closer to smart cities. While the idea of using digital twins in smart city policy co-creation is rather new, we don’t necessarily see it as an emerging technology. For us, the digital twin is a concept that pulls together several existing mature technologies that became fashionable in the last decade or so, such as AI, the Internet of Things (IoT), and big data.
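The normalisation of non-interoperable data onto a common model, described above, can be sketched as a per-source field mapping that an administrator extends to support new formats. The field names and payload shapes below are our own illustration; DUET's actual data model is not specified here.

```python
# Hypothetical sketch: map heterogeneous sensor payloads onto one
# common data model so downstream models can treat all sources uniformly.
from datetime import datetime, timezone

COMMON_FIELDS = ("source", "timestamp", "quantity", "value", "unit")

def normalise(source, payload, mapping):
    """Translate a source-specific payload into the common data model.

    `mapping` says which incoming key feeds each common field; an
    administrator can register new mappings to support more formats.
    """
    record = {"source": source}
    for common_key, incoming_key in mapping.items():
        record[common_key] = payload[incoming_key]
    # Normalise epoch timestamps to UTC ISO-8601 strings.
    ts = record["timestamp"]
    if isinstance(ts, (int, float)):
        record["timestamp"] = datetime.fromtimestamp(
            ts, tz=timezone.utc
        ).isoformat()
    return record

# One source with its own schema, normalised to the common shape.
telraam_payload = {"t": 1650614400, "cars_per_hour": 412, "u": "veh/h"}
mapping = {"timestamp": "t", "value": "cars_per_hour", "unit": "u"}
rec = normalise("telraam", telraam_payload, mapping)
rec["quantity"] = "traffic_volume"
print(sorted(rec) == sorted(COMMON_FIELDS))  # → True
```

Once every source is expressed in the same five fields, compatibility checks between datasets reduce to comparing quantities and units rather than parsing source-specific formats.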

  • DUET at European Big Data Value Forum

    Due to the ongoing COVID-19 situation, most face-to-face events had to find innovative ways to switch to a virtual format, at least for the duration of the lockdowns and the pandemic. The European Big Data Value Forum (EBDVF) was one of the events that shifted gear and managed to find an opportunity in a problematic situation. What the team managed to achieve virtually, with the help of the Whova platform, is simply brilliant, and it is not just us saying it but the numbers: 1,758 tickets sold through Whova and 1,253 announcements opened. DUET, an H2020 project, was a sponsor of this event and we are happy to report that it received 5,979 impressions. An impression is counted when an attendee clicks on the sponsor banner or navigates to the sponsor's customised resources. More numbers and reports for each project can be found in the full report provided by Whova.

On Thursday 5 November, DUET together with the PolicyCloud and Urbanite projects organised a joint parallel session on smart government and co-creating services with the use of AI and data. The session showcased examples of smart government initiatives across Europe that integrate AI, digital twins and big data solutions. DUET's case was presented by Lieven Raes from Informatie Vlaanderen, the coordinator of the DUET project. The material and recordings from most of the EBDVF sessions, including the one where DUET was presented, are now available on the EBDVF webpage. For more information on the DUET project, please do not hesitate to follow us on Twitter and join our mailing list to be the first to find out our news.

  • Smart City Models for the DUET Solution

    In one of our previous blogs we presented the progress made by pilots toward creating actionable what-if scenarios, otherwise known as epics, to support decision making across three main policy fields: transportation, environment and health. In this article, we would like to elaborate on the traffic, air quality and noise models that will form part of the DUET Digital Twin solution and that will ultimately be used to support the implementation of user scenarios (see D2.3 for a full list) in the participating pilot locations.

Traffic models

Traditionally, specific instances of a wide range of potential transportation models were developed, and in many cases fine-tuned over years or even decades, by specific stakeholders in the domain. A national transportation authority would develop and maintain region- or even nation-wide explanatory models for average traffic loads during peak periods across all transportation modes and infrastructures within its jurisdiction. Such models are often used in various scenarios and cost-benefit analyses related to large infrastructure projects or tax simulations. Local traffic controllers, on the other hand, would develop very precise, minute-to-minute dynamic models of traffic operations on a signalised corridor that they are managing. For them, traffic models offer a way to explore what-if scenarios for better incident response or real-time optimisation for different traffic types. In theory, the more detailed model layers can be configured for any region or neighbourhood in the larger territory. However, the usefulness of such models depends entirely on their empirical validity. At present, no automated calibration procedures exist, and there are generally no commonly agreed guidelines on which data is required to guarantee a certain level of validity of the model outputs.
Within DUET, the more refined models will be configurable in principle over the entire territory, but in practice can be trusted empirically only for those zones where substantial calibration efforts have been performed in the context of specific pilot use cases and using dedicated data sources. DUET will showcase three different kinds of traffic models: static, dynamic and a local mobility model, which can be thought of as a multi-modal traffic state estimation.

Models based on Static Traffic Assignment (STA) capture two basic assumptions that we usually associate with traffic. First, people strive to minimise their travel times by choosing the shortest route. This behaviour eventually leads to something we call ‘equilibrium’: a state in which all the used routes from A to B have the same travel time, which is equivalent to saying that no traveller can unilaterally reduce their travel time by switching to a different route. Second, the travel time on a road increases with the number of people that use it. With properly calibrated demand data, STA can provide a useful overview of traffic flows in a system and facilitate predictions based on alternate road network graphs or different demand patterns. This covers a wide range of scenarios that could be explored by a Digital Twin. A non-exhaustive list includes:

  • Mobility effects of new developments, e.g. building a new apartment complex can alter the mobility demand and cause new traffic from this location to typical destinations (work zones, universities)
  • Toll-induced deviations from the routes on which tolls are levied
  • Simulating the impact of road blocks / closed lanes due to construction work
  • Analysing the effects of changes to a signalling plan
  • Predicting the changes in travel time caused by larger travel demand to a city because of an event (convention, festival, championship)

Dynamic Traffic Assignment (DTA) brings much more nuance by adding a temporal dimension to Traffic Assignment.
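The second STA assumption, that travel time on a road grows with its load, is commonly captured with a volume-delay function. The BPR (Bureau of Public Roads) curve below is a standard textbook choice, shown as an illustrative sketch rather than the specific function DUET's models use:

```python
# Illustrative BPR volume-delay function: travel time on a link grows
# with its volume/capacity ratio, steeply once the link nears capacity.
def bpr_travel_time(free_flow_time, volume, capacity, alpha=0.15, beta=4):
    """t = t0 * (1 + alpha * (v/c)^beta), the classic BPR curve."""
    return free_flow_time * (1 + alpha * (volume / capacity) ** beta)

t0 = 10.0  # minutes at free flow
print(bpr_travel_time(t0, volume=500, capacity=1000))   # lightly loaded
print(bpr_travel_time(t0, volume=1000, capacity=1000))  # at capacity: 11.5
print(bpr_travel_time(t0, volume=1500, capacity=1000))  # heavily congested
```

An STA solver iterates route choice against such a function until the equilibrium described above is reached: no used route from A to B is faster than any other.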
It enables us to capture more aspects of the traffic system’s behaviour that we observe in reality: queues are modelled explicitly and spillback effects are taken into account. Traffic management applications such as ramp metering, route guidance or signal coordination optimisation need algorithms of their own to find the optimal solutions, and may only utilise parts of DTA as a way to evaluate a solution. Those optimisation mechanisms are beyond the scope of DUET, but can play a role in future, more detailed models of a Digital Twin.

Figure 1. Traffic volume (vehicles/hour) on each city link

Lastly, a local traffic flow model (Cityflows) uses different real-time data sources in the city to better understand the multi-modal dynamics of city traffic. Based on different data sources (e.g. signalling data, wifi scanning data, camera object detection, Telraam data), the model estimates the density of traffic and discriminates between different modalities, such as motorised and non-motorised vehicles, in the streets of a certain area. By combining Cityflows output data with other available data sources, such as OpenStreetMap and weather APIs, we will be able to investigate correlations between flows in the city and commercial areas, some of which may also be found in suburbs.

Figure 2. Sample output visualisation of Cityflows

Air quality emission models

Air quality models use mathematical and numerical techniques to simulate the physical and chemical processes that affect air pollutants as they disperse and react in the atmosphere. Three components that are usually quantified to indicate local air quality are particulate matter (PM10), ultrafine particulate matter (PM2.5) and nitrogen dioxide (NO2). In DUET, we’re leaning towards the SRM-1 and SRM-2 models, given the role that traffic emissions and the built environment play in urban air quality. SRM-1 is intended for calculating concentrations of air pollutants near traffic roads in urban areas, also referred to as "city roads."
Air currents around buildings influence the air flow in the streets and thus the level of air pollution concentrations. This is different from highways and other rural roads, where air pollution emitted by traffic does not get stuck between physical obstacles but is directly carried away by the wind; the standard calculation method SRM-2 applies to this type of road. Within DUET’s Digital Twin solution, air quality emissions will be calculated using data on traffic volume and the road network. Other variables such as wind speed and wind direction are also being considered. Eventually, the model will be able to calculate the dispersion of air pollution caused by traffic for a grid of spatially referenced calculation points. The results will be converted to map images using interpolation or heatmap techniques and overlaid on a map. Calculations will be performed for several pollutants (PM10, PM2.5, NO2) based on weather information (wind direction, wind speed) and spatial conditions. The results of air quality modelling and simulation will help urban developers and city planners assess the impact of policy decisions prior to committing to costly infrastructure changes.

Figure 3. Difference plot for NO2 that shows the effect of a road closure in a city centre

Noise emission models

Traffic noise in urban areas affects the lives and health of many people. To better manage and minimise its negative impacts, the European Commission requires major EU cities to produce noise maps and the corresponding noise-exposure distributions. In DUET, our intention is to make use of existing open source tools such as NoiseModelling. We have already tested NoiseModelling in a desktop environment and are now looking for ways to integrate it into the Traffic Modeller (TraMod) solution. With TraMod, users are able to explore different traffic scenarios by changing multiple road parameters, e.g. free-flow speed and capacity.
The addition of noise calculation in near real-time would allow them to also explore how, for example, planned roadworks will influence existing noise levels around the construction site. Another tool under consideration is the Urban Strategy Noise Module. It takes into account three different data sources (road traffic, rail traffic, industry) to create noise maps and the various outputs necessary for calculating noise-exposure distributions. In countries like the Netherlands it is used as a decision support tool for measuring the effects of different planning scenarios involving infrastructure, buildings and sound barriers.

Figure 4. Noise Lden map for Industry

Next steps: model integration in a Digital Twin environment

This is an innovation challenge that needs to be gradually developed and expanded. Not only does it require a library of modules that in principle should be compatible (e.g. finer-resolution models should be disaggregates of lower-resolution versions), it also requires different data sources and calibration procedures to be integrated in the digital twin environment. The way in which models interact with one another depends on how tightly they are intertwined. For instance, travel demand depends on the generalised cost, and the generalised cost depends on the travel demand. One of the challenges for DUET will be to find a compromise between the decomposition of models into their component models (which allows for more intricate interaction schemes and increases developers' flexibility) and the effort involved in providing APIs. A decision needs to be made as to which components of larger models should be exposed to the developers using the platform. It may be necessary to provide the expert user with an opportunity to design their own model-use ‘cookbooks’ where they indicate calculation sequences of different models for some analysis. This is similar to the procedure sequences PTV uses in its macroscopic modelling software Visum.
Alternatively, we may provide external access to the different modelling tools through an API. From a user perspective, it’s reasonable to expect that the traffic flows predicted by the assignment models somewhat mirror the observations made by the sensors visualised within DUET. An interesting extension to DUET would be a (semi-)automatic calibration module that stores and monitors sensor data, compares it with the flows generated by a calibrated traffic assignment module, and/or uses statistical methods to identify sensors that are behaving unexpectedly. If the differences across the network become too large, a rerun of the model calibration module may be needed to update the origin-destination tables. This should be a carefully considered step though, as calibration has a substantial computational cost.
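The sensor-versus-model comparison just described can be sketched as follows; the thresholds, link ids and data shapes are our own illustration, not a DUET specification:

```python
# Hypothetical sketch: flag links where sensor counts diverge from the
# flows of a calibrated assignment model, then decide whether the gap
# across the network justifies a costly recalibration run.
def divergent_links(sensor_counts, model_flows, rel_tol=0.25):
    """Return link ids whose relative error exceeds `rel_tol`."""
    flagged = []
    for link, observed in sensor_counts.items():
        predicted = model_flows.get(link)
        if not predicted:  # no model flow for this link, or zero flow
            continue
        if abs(observed - predicted) / predicted > rel_tol:
            flagged.append(link)
    return flagged

def needs_recalibration(sensor_counts, model_flows, max_share=0.5):
    """Trigger a rerun only when many links diverge: calibration is
    computationally expensive, so a single odd sensor is not enough."""
    bad = divergent_links(sensor_counts, model_flows)
    return len(bad) / len(sensor_counts) > max_share

model = {"A1": 1000, "B2": 400, "C3": 250}   # calibrated assignment flows
sensors = {"A1": 980, "B2": 720, "C3": 260}  # observed counts; B2 is off
print(divergent_links(sensors, model))       # → ['B2']
print(needs_recalibration(sensors, model))   # → False (only 1 of 3 links)
```

A single flagged link is more likely a misbehaving sensor than a stale origin-destination table, which is why the recalibration decision looks at the share of divergent links rather than any one of them.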

  • Digital Twins

    In order to predict the future of 6G technology and its implications for digital twins, and in particular the field of urban planning, it can be useful to look back and reflect on how far we have travelled. It can be argued that the antecedents of digital twins were early replicas of objects and processes that provided enhanced functionality. Computer-aided design (CAD) systems in the 1960s proved to be a step change from manual techniques. Simulators are what enabled NASA to save the crew of Apollo 13 in 1970[1]. NASA continues to make heavy use of digital twins for its space exploration initiatives, including for its rovers on Mars. Two decades later, as the Arpanet evolved into the internet and protocols arose to support the transfer and accessibility of information from anywhere, the World Wide Web became a new model for replicating or mirroring reality. The web, parallel computing, high-speed data, cloud computing and other technologies all enabled further development of this concept. Digital twins were until very recently limited to the aerospace and heavy machinery market, but this is changing and there are now a variety of use cases in diverse sectors such as smart cities, healthcare, insurance and utilities. The use of digital twins in smart cities is now well established, with initiatives such as Digital Urban European Twins (DUET) leading the way. With the right data, as DUET has effectively demonstrated, you can model whole urban systems in ways that were unimaginable only a few years ago. The use of digital twins is truly a game changer when it comes to engaging citizens in the formulation of policies, heralding a new era of responsive cities. It's a cliché to state that the pace of change and innovation has accelerated exponentially since CAD systems were introduced; yet we only have to think about the current possibilities associated with advanced sensors, AI, advanced machine learning, cloud computing and big data analytics.
    Whilst it's reasonable to say that the technology and tooling have moved at pace, challenges around open data and privacy remain. The urban planner would ideally like to combine open public data with private sources of data to mine new insights; however, ways of working and regulatory frameworks are lagging behind. Could this disconnect between technology and data be the real conundrum for the next decade? So imagine if we become Nostradamus[2] for a minute… what are the possibilities that 6G technology will deliver by 2030? Staring into the future to make predictions is always fraught with difficulties; however, it's reasonable to assume the following trends. Samsung predicts three key 6G services: immersive extended reality (XR), high-fidelity mobile holograms, and digital replicas. "With the help of advanced sensors, AI, and communication technologies, it will be possible to replicate physical entities, including people, devices, objects, systems, and even places, in a virtual world," the white paper[3] states. People will be increasingly working and socialising remotely; no doubt the new norm in a post Covid-19 world. Video calls will be replaced with immersive reality communication enabled by next-generation virtual reality (VR) devices and holographic displays. The Centre for Converged TeraHertz Communication and Sensing (ComSenTe) anticipates[4] that sixth-generation wireless connectivity will come with speeds of 1 to 100 Gbps. MU-MIMO capability of 100 to 1,000 simultaneous independently modulated beams will provide speeds in the tens of terabits per second, some 50 times the peak data rate of 5G. The main user of 6G technology, according to Samsung, will actually be machines.
    The firm cites estimations that there will be 500 billion connected devices in the world by 2030 – 59 times larger than the expected world population by that time.[5] Within three to five years, Gartner predicted in 2017[6], "billions of things will be represented by digital twins, a dynamic software model of a physical thing or system." We could even witness autonomous cars and drone delivery as an everyday reality in our cities.[7] It will be possible to create a bio-digital twin or alter ego[8]. This will require both aggregated and patient-specific data, relying on nanoscale electronic sensors that can work with or alongside a targeted organ or tissue, generating reliable, low-noise data from human patients. So what are the implications of all these changes for responsive cities, and in particular for DUET? We could have 125 billion IoT connected devices with a wireless network capable not only of providing speed and bandwidth but also of handling hundreds of thousands of connections on a single cell. This 6G capability will drive a revolution in transportation, supporting the future of autonomous vehicles, platooning and intelligent roads. Part of how that will work is through wideband inter-car links, which communicate data and measure vehicle locations to the millimeter; these links also anticipate and manage movement and proximity to avoid collisions. Cities will truly be responsive as they benefit from this seamless, ubiquitous connectivity, enabling the deployment of millions of small, battery-less, connected devices that will provide a new level of data analytics. The urban planner of the future will be able to engage with citizens using immersive extended reality (XR) 'tours' of the city environment. The one probable brake on this dynamic future will be access to open data sources which can complement the vast data gathered from sensors.
    Perhaps looking further into the future, we may enable citizens to share their own bio-wellbeing data so that they can better interact with their urban environment. Understanding patterns of behaviour and what causes stress could deliver additional insights for DUET. Imagine the citizen who can access mobile holograms that improve their 'experience' of how they move around the city, whether that be by walking, driving or taking public transport. We could even see smartphones taken over by pervasive extended reality (XR) experiences[9] via lightweight glasses; an innovation that will deliver unparalleled resolution, frame rate and dynamic range.

    [1] https://www.forbes.com/sites/forbestechcouncil/2020/07/08/how-far-bio-digital-twins-have-come-and-what-may-be-next/#7e49a936b92e
    [2] https://en.wikipedia.org/wiki/Nostradamus
    [3] https://cdn.codeground.org/nsr/downloads/researchareas/6G%20Vision.pdf
    [4] https://www.smartcity.press/advent-of-6g-technology/
    [5] https://www.independent.co.uk/life-style/gadgets-and-tech/news/6g-samsung-digital-twins-holograms-a9620071.html
    [6] https://www.gartner.com/smarterwithgartner/gartners-top-10-technology-trends-2017/
    [7] https://iot.eetimes.com/5g-is-already-here-6g-will-arrive-soon/
    [8] Ibid - Forbes
    [9] https://www.oulu.fi/university/news/6g-white-paper

  • A Traffic Modeler Based App Comes First in the Dubrovnik INSPIRE Hack

    We are pleased to announce that the team which built a solution around Traffic Modeler, one of our visualisation tools, was awarded first place at the prestigious Dubrovnik INSPIRE Hackathon 2020. The demo created by the team focused on Františkovy Lázně, a small Czech town of about five thousand inhabitants. The results show that interactive traffic modelling can improve traffic planning in any city, no matter its size. The demo was built in three steps. First, the team had to gather sufficient data on the traffic network and the traffic generators to be simulated. Next, the traffic model was calculated and imported into a spatial database for further processing by the Traffic Modeler. In the final step, different traffic scenarios were modelled and simulated using the tool's Application Programming Interface. The demo competed against 11 other prototypes, and won. The jury awarded it first place because it best demonstrated innovation potential along with an impressive readiness level, sustainability and interoperability. The jury also said that Traffic Modelling from web browser, as the demo is officially known, combined different data sources better than the competing ideas. Commenting on the results, Daniel Beran, the team's leader, said: "We are extremely happy with the outcome. First place at the Inspire Hack is an excellent achievement. But we don't want to stop here. Our plan is to develop a web-based Graphical User Interface similar to the one already used by the DUET pilot city Pilsen. When ready, the app will allow users to calculate various traffic scenarios in near real time. They will only need a web browser and a good internet connection to start experimenting."

    The Pilsen Traffic Modeler App

    Congratulations to Daniel and his team: Jan Blahník, Petr Trnka, Eva Podzimková, Zuzana Soukupová, Jan Sháněl, František Kolovský, Jan Šťastný, Jan Martolos, Karel Jedlička
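The three-step workflow (gather network and generator data, compute a base model, then simulate alternative scenarios) can be illustrated with a toy stand-in. The data structures and the capacity-proportional assignment rule below are purely hypothetical simplifications, not the actual Traffic Modeler computation or its API.

```python
# Toy illustration of the demo's three steps. Everything here is a
# simplified stand-in: real traffic assignment works on a routed network,
# not a capacity-proportional split.

# 1. Gather data: link -> capacity (veh/h); generators: zone -> trips produced
network = {"main_street": 1200, "bypass": 800}
generators = {"residential": 900, "industrial": 600}

# 2. Compute a base model: spread total demand over links by capacity share
def assign(network, total_demand):
    capacity = sum(network.values())
    return {link: total_demand * cap / capacity
            for link, cap in network.items()}

base = assign(network, sum(generators.values()))

# 3. Simulate a scenario: close main_street and re-assign the same demand
scenario_network = {k: v for k, v in network.items() if k != "main_street"}
scenario = assign(scenario_network, sum(generators.values()))

print(base["main_street"])  # 900.0
print(scenario["bypass"])   # 1500.0
```

Even this toy version shows why near-real-time recalculation matters: a single closed link reroutes all of its demand onto the remaining network, and planners want to see that effect immediately.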

bottom of page