
    embracingdigital

    Explore " embracingdigital" with insightful episodes like "#189 Parallel Works AI Workload Automation", "#188 Surveying Black Swan Events with Digital Transformation", "#187 GenAI RAG Details", "#186 Introduction to GenAI RAG" and "February 11, 2024" from podcasts like ""Embracing Digital Transformation", "Embracing Digital Transformation", "Embracing Digital Transformation", "Embracing Digital Transformation" and "Embracing Digital This Week"" and more!

    Episodes (43)

    #189 Parallel Works AI Workload Automation


    In a data-driven world where technology is king, a lively discussion unfolds between Darren Pulsipher, host of Embracing Digital Transformation, and Matthew Shaxted, president of Parallel Works, as they navigate the captivating sphere of High-Performance Computing (HPC) and its monumental role in machine learning and AI.

    This episode examines this rapidly advancing field, shedding light on its profound influence on our lives. Two main areas anchor the discussion: the evolution of HPC, with its potential efficiencies and challenges, and Parallel Works, a company born out of the need to democratize industry-specific workloads using high-performance computing models.


     The Evolution of High-Performance Computing


    In the last ten years, high-performance computing (HPC) has undergone a significant transformation. Shaxted highlights that current technology allows us to fit almost five times more cores on a single chip than we could a decade ago. Each core represents a distinct processing unit capable of functioning independently of the other cores. This results in a significant surge in performance power, providing an affordable and efficient execution methodology that was previously only possible through high-cost supercomputing.


    Although there have been significant advancements in the field of high-performance computing (HPC), setting up and configuring advanced HPC clusters is still an extremely difficult task. The challenge is not only limited to the hardware aspect, but it also includes the complex process of task setup. This requires a detailed knowledge of parallel computing, which adds to the steep learning curve.


     Democratizing HPC with Parallel Works


    Shaxted and his co-founder, Mike Wild, had a vision to revolutionize the High-Performance Computing (HPC) industry, and they set out to achieve it by creating Parallel Works. The idea behind Parallel Works was to democratize industry-specific workloads and make them accessible on a commercial scale. The primary objective of Parallel Works was to simplify the HPC process and make it more user-friendly. 


    This initiative aims to simplify the computational complexities of High-Performance Computing (HPC) for professionals in different industries. Its goal is to make this technology and its numerous advantages accessible to as many people as possible, regardless of their computing background. By doing so, it will significantly reduce the learning curve and make it easier for more people to benefit from HPC.


     The Future of HPC


    After the conversation between Shaxted and Pulsipher concluded, it was clear that HPC (High-Performance Computing) has a bright future ahead. HPC can significantly improve computational speed, provide access to advanced technologies and support the development of innovative solutions in machine learning and AI.


    Echoing this thought, Shaxted acknowledges the ever-evolving role of HPC and its potential to drive innovation. It remains a crucial component for pioneering solutions, paving the way towards a more efficient and intelligent future.


    Businesses and industries can benefit greatly from the integration of high-performance computing, as they ride the wave of digital transformation. This approach is considered the way forward by Pulsipher and Shaxted, as it provides the necessary computational boost to data-intensive industries, and also democratizes access for all.

    #188 Surveying Black Swan Events with Digital Transformation


    Darren interviews Dan Berges about his journey through the COVID-19 pandemic to transform the Berges Institute, a Spanish language school in New York City. Despite initial challenges, the shift reshaped work dynamics, broadened their global reach, and highlighted the importance of understanding business processes and coding for successful digital transformation.

    In an era of rapid technological advancements, digital transformation no longer remains a luxury; it's now a necessity to ensure business continuity. A testament to this reality is the story of the Berges Institute, a Spanish language school for adults located in the heart of New York City.


     Swift Pivot to Online Learning


    With the onset of the COVID-19 pandemic, the need for a rapid transition from traditional to online classes became imminent. Leveraging their digital infrastructure, Berges Institute managed to make this shift seamlessly within a time span of two days.


    Influenced by the precautionary measures and social distancing rules, nearly 70% of students had already begun taking online courses. However, the decision to completely close in-person classes in cities like New York and Chicago was both a challenging and game-changing turning point for the establishment. Despite fears of losing students due to the lack of in-person interaction, the majority remained steadfast and loyal to the online program.


     Reshaping Work Dynamics and Broadening Reach 


    One of the positive aspects of this dramatic transition is the reshaping of work dynamics. Berges Institute's education providers were given the flexibility to teach from their homes or even from across the world. This new operational mode was positively embraced by the teaching staff, resulting in an improvement in performance and overall work satisfaction.


    Simultaneously, the shift to online classes introduced Berges Institute to a broader demographic. No longer limited by the geographic bounds of the United States, they now had the opportunity to tap into English-speaking markets globally. They have reached students in Europe, Australia, Canada, India, and the Emirates, thereby expanding their influence as a renowned language institute.


     Understanding Business Processes and Coding


    Despite the successful transition, the road to digital transformation was not devoid of challenges. Operating a code base as a small business was seen as a significant hurdle. The founder and managing director of Berges Institute, Dan Berges, emphasized the need for business owners to possess at least a basic understanding of coding and programming. 


    By investing time to understand the business's processes prior to jumping into the coding phase, businesses can ensure that the code base created is maintainable. Building a strong relationship with developers who profoundly understand your business logic is indispensable during this process.


     Adapting Business Workflows for Digital Structure


    The key takeaway from the Berges Institute's digital transformation story is the importance of adapting existing business workflows to complement your impending digital structure. It's vital to understand that a traditional business model can't always be translated accurately into a digital platform.


    In conclusion, the navigation journey from brick and mortar to online business may initially seem daunting. But with the right tools, mindset, and understanding of your business's core processes, a successful digital transition is certainly achievable. As Berges Institute has demonstrated, embracing digital transformation is more than just a survival strategy — it's a path to growth and expansion in today's digital era.

    #187 GenAI RAG Details


    In part two of his interview with Eduardo Alvarez, Darren explores the use of GenAI LLMs and RAG (Retrieval Augmented Generation) techniques to help organizations leverage the latest advancements in AI quickly and cost-effectively.

     Leveraging Language Model Chains


    In a landscape where accessible technologies are ubiquitous, operational efficiency sets an application apart. That said, handling an assortment of tasks with a single language model does not always yield optimal results, which brings us to the concept of Language Model (LM) chains.


    LM chains integrate several models working together in a pipeline to improve user interaction with an application. Every segment of your application may perform best with its own specialized language model; there is no one-size-fits-all policy when it comes to language models. Several real-world implementations are already capitalizing on the strength of multiple LMs working in harmony.
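    To make the idea concrete, here is a minimal sketch of an LM chain: one model's output feeds the next model's prompt. The model names and the `call_model` helper are hypothetical placeholders for whichever inference client an application actually uses; the episode does not prescribe a specific stack.

```python
def call_model(model: str, prompt: str) -> str:
    """Hypothetical stand-in for an inference client; wire this to your own API."""
    raise NotImplementedError("replace with a real model call")

def answer_from_document(document: str, question: str) -> str:
    # Stage 1: a small, cheap model condenses the document.
    summary = call_model("summarizer-small", f"Summarize the key points:\n{document}")
    # Stage 2: a larger model answers using only the condensed context.
    return call_model("qa-large", f"Context:\n{summary}\n\nQuestion: {question}")
```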


     System Optimization and Data Veracity


    The holistic optimization of the system is an integral part of leveraging LM chains. Everything from choosing the perfect moment to deploy a large language model to selecting the ideal architecture for computing forms an essential part of this process. The right decisions can dramatically bolster system performance and improve operational efficiency.


    Integrating multiple models also opens novel avenues for research and development, particularly around data veracity within such setups. It poses fascinating challenges and opportunities ripe for exploration and discovery. 


     Maintaining Discretionary Access for Data Privacy


    When discussing data privacy, it is essential to understand the balance between utilizing more extensive institutional databases and preserving private user information. Eduardo suggests maintaining discretionary control over database access, ensuring operational superiority and data privacy. 


     Rising Fusion of AI and Real Data Ops


    Predicting future trends, Eduardo anticipates a fusion of real data ops and AI ops, resembling the blend of operational excellence and tool integration achieved by configuration management engineers in the '90s. This blend translates into distributed heterogeneous computing in AI and shapes the future of AI ops.


     Concluding Thoughts


    Technology should invariably strive to simplify systems without sacrificing performance or efficiency. A thorough understanding of the available tools is a prerequisite to successfully leveraging them. Incorporating the LM chains in AI applications is a step in this direction, paving the way for an enriched user experience. Our conversation with Eduardo Alvarez underscores the importance of these insights in propelling the intriguing landscape of AI.

    #186 Introduction to GenAI RAG


    In a rapidly evolving digital sphere, generative Artificial Intelligence (GenAI) is capturing the attention of technophiles across the globe. Regarded as the future of AI technology, GenAI is broadening boundaries with its potential for accurate simulations and data modeling. A prominent figure in this arena, Eduardo Alvarez, an AI Solution Architect at Intel and former geophysicist, holds invaluable insights into this fascinating world of GenAI.

     An Intersection of Geophysics and AI 


    Eduardo’s journey from geophysics to artificial intelligence provides an exciting backdrop to the emergence of GenAI. As he transitioned from a hands-on role in the field to an office-based role interpreting geophysics data, Eduardo was introduced to the ever-intriguing world of machine learning and AI. His first-hand experience collecting and processing data played a pivotal role as he explored the tech-saturated realm of AI. This journey underscores how disciplines often perceived as separate can contribute significantly to the development and application of AI technology.


     Bridging the Gap between Data Scientists and Users


    Generative AI presents several promising benefits, a key being its potential to act as the bridge between data scientists and end-users. In traditional setups, a significant gap often exists between data scientists who process and analyze data and the users who leverage the results of these actions. GenAI attempts to close this gap by providing more refined and user-friendly solutions. However, it's crucial to acknowledge that GenAI, like any technology, has limitations. The thought of storing sensitive data on public cloud platforms is indeed a daunting prospect for many businesses.


     Enhancing Interaction with Proprietary Data


    Despite concerns around data security, mechanisms exist to securely enhance models' interaction with private or institutional data. For instance, businesses can train their models on proprietary data. Still, this approach raises questions about resource allocation and costs. These interactions emphasize the significance of selectively augmenting data access to improve results while maintaining data security.


     The Exciting Potential of GenAI 


    The conversations around GenAI hold promise for the future of AI. This period of rapid advancement brings countless opportunities for innovation, growth, and transformation. As more industries adopt this revolutionary technology, it's clear that Generative AI is sculpting the landscape of artificial intelligence and machine learning. This exploration instigates a more profound interest in GenAI and its possibilities. Our journey into the AI landscape continues as we unravel the mysteries of this exciting technological frontier.


     Extending GenAI with Retrieval Augmented Generation (RAG)


    GenAI has limitations, including data privacy concerns, long training times, and accuracy of results, because large language models require extensive data for training. Context becomes crucial, particularly in language processing, where a single word can have multiple meanings. RAG architectures augment user prompts with context retrieved from a vector database, which reduces training time, enhances data privacy, and keeps the broad out-of-the-box knowledge of LLMs anchored to relevant context.
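    As a rough illustration of the retrieval step described above (not the specific stack discussed in the episode), the sketch below embeds a tiny document store, finds the passage closest to the user's question, and prepends it to the prompt. It assumes the `sentence-transformers` package and the `all-MiniLM-L6-v2` embedding model; any embedding model and vector store could play the same roles, and the documents here are invented examples.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")
docs = [
    "Our PTO policy grants 20 days of paid leave per year.",
    "Expense reports are due by the 5th of each month.",
]
doc_vecs = encoder.encode(docs, normalize_embeddings=True)   # stands in for the vector database

def retrieve(query: str, k: int = 1) -> list[str]:
    q_vec = encoder.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q_vec                    # cosine similarity on normalized vectors
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

question = "How many vacation days do employees get?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"
# `prompt` is then sent to whichever LLM the application uses.
```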

    February 11, 2024


    Digital Transformation News for the week of February 11, 2024 includes stories on data management, ubiquitous computing, and advanced communications. The articles cover topics such as OEMs moving away from VMWare, Super Bowl networking, and the impact of AI on data management. Blog: https://embracingdigital.org/brief-EDW54-en Video: 

    11 Febbraio 2024


    Digital Transformation news for the week of February 11, 2024 includes stories on data management, ubiquitous computing, and advanced communications. The articles cover topics such as OEMs moving away from VMWare, Super Bowl networking, and the impact of AI on data management. Blog: https://embracingdigital.org/brief-EDW54-it Video: 

    11 de febrero de 2024


    Digital Transformation news for the week of February 11, 2024 includes stories on data management, ubiquitous computing, and advanced communications. The articles cover topics such as OEMs moving away from VMWare, Super Bowl networking, and the impact of AI on data management. Blog: https://embracingdigital.org/brief-EDW54-es Video: 

    #185 History of Data-Centric Applications (revisited)


    The first episode of this podcast was released 185 episodes ago. In this episode, the host Darren Pulsipher redoes episode one to provide updated information on the history of data-centric application development. He discusses how new technologies like edge computing and AI have impacted data generation and the need for better data management.

     Early Data Processing 


    In the early days of computing, applications were built to transform data from one form into another valuable output. Early computers like the ENIAC and Turing's machine for breaking the Enigma code worked by taking in data, processing it via an application, and outputting it to storage. Over time, technology advanced from specialized hardware to more generalized systems with CPUs and networking capabilities. This allowed data sharing between systems, enabling new applications.


     Emergence of Virtualization


    In the 1990s and 2000s, virtualization technology allowed entire systems to be encapsulated into virtual machines. This decoupled the application from the hardware, increasing portability. With the rise of Linux, virtual machines could now run on commodity x86 processors, lowering costs and barriers to entry. Virtualization increased ease of use but introduced new security and performance concerns.


     The Rise of Cloud Computing 


    Cloud computing is built on virtualization, providing easy, on-demand access to computing resources over the internet. This allowed organizations to reduce capital expenditures and operational costs. However, moving to the cloud meant security, performance, and integration challenges. Cloud's pay-as-you-go model enabled new use cases and made consuming technology resources easier overall.


     Containerization and New Complexity


    Containerization further abstracted applications from infrastructure by packaging apps with their runtimes, configuration, and dependencies. This increased portability, but it also added complexity in managing distributed applications and data across environments. Data locality became a key concern, contradicting the assumption that data is available everywhere. This evolution also brought significant new security implications.


     Refocusing on Data 


    To address these challenges, new architectures like data meshes and distributed information management focus on data locality, governance, lifecycle management, and orchestration. Data must be contextualized across applications, infrastructure, and users to deliver business value securely. Technologies like AI are driving data growth exponentially across edge environments. More robust data management capabilities are critical to overcoming complexity and risk.


     Security Concerns with Data Distribution


    The distribution of data and applications across edge environments has massively increased the attack surface. Principles of zero trust are being applied to improve security, with a focus on identity and access controls as well as detection, encryption, and hardware roots of trust. 


     The Edgemere Architecture


    The Edgemere architecture provides a model for implementing security across modern complex technology stacks spanning hardware, virtualization, cloud, data, and apps. Applying zero trust principles holistically across these layers is critical for managing risk. Robust cybersecurity capabilities like encryption and access controls are essential for delivering business value from data in the new era of highly distributed and interconnected systems.

    #184 Effective Change Management with SEAM


    Digital transformation can be a challenging task for organizations, and its success or failure can have a significant impact on a company's future, regardless of its size. In this week's episode, Dr. Madeleine Wallace shares her insights into the SEAM framework, a systematic approach to adopting digital transformation.

    In the rapidly evolving digital landscape, businesses are constantly required to adapt and innovate. One individual who deeply understands this changing landscape is Dr. Madeleine Wallace, who experienced first-hand the significant impact of digital transformation while growing up in rural Peru. Her experiences have shaped her professional approach, leading her to develop the Snapshot, Evaluate, Act, and Monitor (SEAM) Framework to facilitate effective organizational change.


     SEAM Framework: Setting the Stage for Change


    Digital transformation is an inevitable reality for contemporary companies and can either lead to tremendous growth or an abrupt downfall depending on how well businesses navigate this era of change. Dr. Wallace's past experiences, notably the closure of her parents' vocational school due to failed adaptation to digitalization, made her realize the central role of readiness in the process of transformation. It set the stage for her development of the SEAM Framework.


    The SEAM approach proposes an action-focused plan that kickstarts with taking a realistic snapshot, a detailed assessment, of the existing state of a corporation. It encourages leaders to ask insightful questions about what's functioning well and what isn't, analyzing strengths, weaknesses, and the obstacles to change. The overall aim is to establish a truthful picture of the organization, defining the starting point for a successful change strategy.


     Evaluation and Actuation: Implementing the SEAM Approach


    Evaluation and actuation are the next crucial steps in the SEAM Framework. Once a snapshot has been taken, the evaluation phase uses this information to determine the steps required for a successful transformation. It presents an opportunity to develop a detailed plan, identifying barriers and defining the actions needed to overcome them.


    During the actuation phase, the organization moves forward with implementing the proposed changes. At this stage, recognition and acceptance of the identified issues become critical. Dr. Wallace emphasizes the need to be open to addressing underlying problems and, if needed, to bring in external consultants who provide expertise beyond the organization's existing capabilities.


     Monitoring the Implementation


    Following the implementation comes the monitoring phase. This stage involves tracking and reviewing all changes to ensure their effectiveness and positive impact. It serves as a way to measure the success of the transformation, and if required, adjust the strategies to better achieve the objectives.


     Digital Transformation: A Necessity


    Acknowledging and addressing the potential difficulties and obstacles to change is a key ingredient in successful digital transformation. Particularly now, the shift to digital integration is not an easy task. It often requires bringing in external experts to help identify potential blind spots. Adapting Dr. Wallace's SEAM framework can provide an insightful and practical approach to assessing and implementing change efficiently.


    Dr. Wallace's insights on organizational change in the digital age reflect an important message for businesses today: embrace digital transformation, assess existing practices, act upon necessary changes and monitor their effectiveness. After all, readiness and adaptability are the keys to surviving and thriving in the digital era.

    #183 Data Management in Material Science and Manufacturing Industries


    In a rapidly evolving technological landscape, leaders from diverse sectors apply data analytics, machine learning, and artificial intelligence to their operations. Today, we take a deeper look at a company driving digital transformation in the manufacturing industry, Materials Zone, with its CTO, Ori Yudilevich.

     Bridging the Gap between Physical and Digital in R&D


    Materials Zone is focused on the niche yet significant aspect of material science, specifically in the manufacturing industry. Given the considerable role of materials in product development, effectively managing data becomes crucial. Analogous to a cooking recipe, material science involves a nuanced integration of ingredients (materials) passed through a process to produce the final product.


    However, this area has historically been ad hoc, relying on trial, error, and intuition. Consequently, the knowledge acquired during this process often gets lost due to insufficient documentation or employee attrition. In our modern, interconnected world, where product development processes often span multiple locations, even countries, establishing structured methodologies to prevent knowledge loss is critical. 


    One of the techniques highlighted by Yudilevich is addressing the "trucking factor," which suggests that if the only person who knows how to do a particular task got hit by a truck, it could potentially derail the entire project. Hence, having at least one other person aside from the primary individual who can perform the task could lower the team's vulnerability.


     Capturing Complexities of Material Science Data


    The field of material science generates complex data that is often unstructured and difficult to capture adequately with traditional data tables and databases. To visualize this, consider data as a graph where raw materials turn into end products. The innumerable interactions between the various constituents give rise to multiple unique dimensions within the data.
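    A minimal sketch of that graph view, using the `networkx` library and invented materials and process parameters, shows how provenance stays traceable from raw inputs to the finished product; it is an illustration of the idea, not Materials Zone's data model.

```python
import networkx as nx

G = nx.DiGraph()
# Edges point from an input to what it becomes; process parameters ride along as attributes.
G.add_edge("polymer resin", "compound A", process="blend", temp_C=180)
G.add_edge("carbon filler", "compound A", process="blend", temp_C=180)
G.add_edge("compound A", "molded part", process="injection molding", pressure_bar=900)

# Everything upstream of the finished part, i.e. its full material provenance.
print(nx.ancestors(G, "molded part"))   # {'polymer resin', 'carbon filler', 'compound A'}
```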


    Moreover, manufacturing demands a smooth handoff from exploratory research to the production phase, which requires stabilization and consistency. Collating data from these phases into a unified repository can enhance the R&D process by centralizing information, aiding inter-phase learning, and accelerating new product development.


     Integrating Data Science into Manufacturing


    While data science has permeated many industries, companies focused mainly on product development in the physical world often find setting up dedicated data departments or integrating analytical tools inefficient and costly. This is where Materials Zone's solution comes into play, making data science, machine learning, and statistical tools accessible to businesses unfamiliar with these areas.


    They offer out-of-the-box tools accompanied by webinars and training sessions for easy adoption, thus reducing the barriers to integrating data science into manufacturing practices. Surprisingly, even Fortune 500 companies who lack the necessary digital skills can benefit significantly from such solutions.


     As We Step Forward


    As the product development process becomes more complex and global, the critical nature of systematic data management combined with technological innovation is coming to the fore. Companies like Materials Zone are paving the path, guiding businesses to bridge their physical-digital knowledge gap, bolster their manufacturing practices, and ensure future success.


    For more information, check out https://materials.zone. 

    #182 Zero Trust Data Assurance


    The need for robust data security strategies has grown exponentially in the digital age, becoming a top priority for businesses around the world. Cybersecurity expert and CTO of Walacor, Walter Hancock, offers keen insight into the importance of data integrity and a zero trust approach in current cybersecurity regimes. 

     Unmasking Assumptions About Data Security


    In the past, people have had implicit trust that their data is secure and their privacy is protected. However, this trust is often based on an outdated model that no longer aligns with the current technological landscape. The increasing number of data breaches and cyber attacks has made it evident that data security is more critical than ever, and the precautions that were considered adequate in the past may no longer be sufficient.


    Today, data is vulnerable to threats not only from external hackers but also from within organizations. It is essential to understand that a data breach can have significant implications, ranging from financial losses to reputational damage. Therefore, it is crucial to implement a zero-trust approach to data management, which means that every request for access to data must be verified before access is granted. Reliable data audits are also necessary to ensure that the data input matches the output and that there is no unauthorized access to sensitive information.


     Implementing a New Age of Data Security with Walacor


    Walacor provides a unique solution to improve our understanding of data security. They offer an automatic and foolproof audit log that is immutable, meaning that once data is entered, it can never be altered or deleted without being detected. This feature makes it incredibly easy to track every change made to the system, which is critical in maintaining a secure environment.
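    The tamper-evidence idea can be illustrated with a simple hash-chained log: each entry commits to the previous entry's hash, so any later alteration breaks verification. This is a generic sketch of the concept, not Walacor's implementation.

```python
import hashlib
import json
import time

class HashChainedLog:
    def __init__(self):
        self.entries = []

    def append(self, actor: str, action: str) -> None:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        record = {"ts": time.time(), "actor": actor, "action": action, "prev": prev}
        record["hash"] = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.entries.append(record)

    def verify(self) -> bool:
        prev = "0" * 64
        for rec in self.entries:
            body = {k: v for k, v in rec.items() if k != "hash"}
            expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if rec["prev"] != prev or rec["hash"] != expected:
                return False      # chain broken: an entry was altered or removed
            prev = rec["hash"]
        return True

log = HashChainedLog()
log.append("alice", "read customer table")
log.append("bob", "update pricing row 17")
assert log.verify()
log.entries[0]["action"] = "nothing to see here"   # tampering...
assert not log.verify()                            # ...is detected
```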


    By providing transparency and traceability, Walacor's solution helps organizations to meet legal compliance requirements and mitigate risks. For instance, in a legal dispute, an immutable audit log can serve as a reliable source of evidence, as it cannot be tampered with. Furthermore, in the event of a data breach, an immutable audit log can help identify the source of the breach and the extent of damage caused.


    Overall, Walacor's innovative approach to data security, with its 100% immutable audit log, offers a promising solution for organizations looking to enhance their cybersecurity posture.


     Shaping the Future of Data Intelligence


    The increasing risk of data breaches means that we need to move away from using multiple layers of data security to a more integrated data protection solution. This type of solution lays the foundation for a Zero Trust environment, which significantly reduces the risk of cyber threats and vulnerabilities. By adopting this approach, we can streamline our data protection methods and ensure better data integrity.


    The development of data intelligence in the form of data integrity and security opens up new possibilities for digital businesses. Improved data protection methods, better data integrity, and a reduction in potential cyber threats are just a few of the benefits that are set to transform the digital landscape. Among these, the talk of the town is Walacor's unique approach to data integrity and zero trust, which marks a significant milestone in how we approach data security now and in the future.


    Check out more information at https://walacor.com.

    #181 Zero Trust in 5G


    In the midst of the growing adoption of 5G technologies worldwide, the experts in this recent episode of the Embracing Digital Transformation podcast delved into the integral topic of Zero Trust in 5G security. Host Darren Pulsipher welcomed Leland Brown, an expert in advanced 5G communications; Yazz Krdzalic, VP of Marketing at Trenton Systems; and Ken Urquhart, a physicist turned cybersecurity professional from Zscaler, to discuss the integration and advancement of 5G technology, along with its challenges and breakthroughs.

     The Expansive 5G Landscape and The Lonely Island Approach


    The world of 5G technology is rapidly evolving, and as a result, there are a lot of insightful discussions taking place around merging Operational Technology (OT) and Information Technology (IT). Yazz Krdzalic describes the concept of the "Lonely Island approach." This approach refers to the tendency of different entities to focus too heavily on solving their individual problems, which has often led to the stalling of growth in custom hardware in telecom infrastructure. 


    The need to break away from this individualistic approach and re-establish a collective architectural framework that can scale and flex with different use cases is becoming increasingly apparent. With the emergence of 5G technology, there is a need for a collaborative approach that can accommodate the various requirements of different entities. The collective approach will help to ensure that the infrastructure is flexible and scalable, making it easier for entities to integrate their technologies and applications into the network. 


    The discussions around merging OT and IT are also gaining momentum, and it is becoming clear that the collaboration between these two domains is essential for the success of 5G technology. As the technology continues to evolve, it is expected that there will be more debates and discussions around how to take advantage of the opportunities presented by 5G, while also addressing the challenges posed by the emerging technology. Overall, the future of 5G technology looks bright, and the collaboration between different entities will play a critical role in its success.


     Transitioning to Zero Trust Security


    As technology continues to evolve, security concerns have become a growing issue for individuals and organizations alike. In order to address these concerns and ensure a safe and secure environment, a collective architectural framework is needed. This framework includes the implementation of advanced security models, such as Zero Trust Security. However, transitioning to these models is not always easy. It requires letting go of older methods of operating and ensuring that all technological modules are synchronized and functioning properly. In the past, it was the customers who were burdened with the responsibility of integrating all the pieces. Fortunately, with the adoption of a more evolved approach, the onus of integration has been considerably reduced for the customers, making the implementation of Zero Trust Security and other advanced security models a much smoother process.


     Finding The Common Ground In 5G Usage


    The development of 5G technology has been a game-changer in both commercial and military sectors. However, there are specific requirements that differentiate the commercial and military usage of 5G. Commercial deployments of private 5G networks are largely static, whereas military deployments need to be mobile. 


    Leland Brown, a prominent expert in the field, has discussed the complexities of finding a common architecture that could cater to both these needs. The challenge was to create a final solution that elegantly fulfilled these requirements. It was important to ensure that the solution was efficient and effective for both commercial and military use cases. 


    The development of such solutions is crucial to ensure that 5G technology is utilized to its fullest potential and can cater to the diverse needs of different industries.


     Wrapping up


    The world of technology is constantly evolving and improving, and the advent of 5G technology and Zero Trust security is a testament to this. However, implementing these advancements can be challenging due to technical and cultural obstacles. Thankfully, experts like Leland Brown, Ken Urquhart, and Yazz Krdzalic are working to streamline the integration of 5G technology and Zero Trust security, making the journey towards a safer and more efficient technological future a little easier for everyone. Their insights and expertise are shedding light on the continuous journey of evolution and improvement in the world of technology.

    #180 Generative AI in Higher Education (Revisited)


    In this week's episode of Embracing Digital Transformation, Darren Pulsipher interviews guest speaker Laura Newey about her fascinating journey through the emerging world of Generative AI, particularly in the education sector. She describes how AI has transformed her teaching and enriched her students' learning outcomes, and she analyzes what it takes to adapt to modern education dynamics.

     How Generative A.I. Enhances the Classroom Experience


    Generative AI is rapidly weaving into educational curriculums, impacting how educators approach teaching and fundamentally enhancing the learning experience. According to Newey, this much-debated technology is not merely a vehicle for plagiarism but a brilliant tool that augments and revitalizes educational methodologies. Encouraging students to use AI in thinking tasks, she emphasizes fostering critical thinking skills in our rapidly digitizing society.


    Rather than lingering as passive participants, she advocates for students to become active players, analyzing the results generated by AI and considering the quality and substance of their input information. The shift underlines the importance of understanding, research, and analysis over mere result generation.


     Transition From Traditional Teaching 


    Newey's progressive approach dramatically diverges from the conventional methods that most educators cling to, especially considering the general resistance towards integrating Generative AI in educational settings. However, she emphasizes the inevitability and necessity of adopting digitalization for the overall advantage of students.


    Comparing this transition with the initial resistance to utilizing the internet as a teaching tool indicates where we stand today. Generative AI, like any other evolving technology, necessitates incorporation within the curriculum and demands regular updates for relevance in this fast-paced digital landscape.


     Balancing Innovation and Ethics


    With progression and innovation, Newey also addresses the ethical considerations inherent to this change. She shares several instances where students, unknowingly or subtly, submitted AI-generated essays. Thus, she emphasizes the need for educators to vigilantly balance embracing the technology with ensuring its ethical use.


    She firmly believes that students can use A.I. as a productive tool, but the responsibility also falls upon educators to guide them toward maintaining academic integrity simultaneously.


     Conclusion: Paving the Way Towards an A.I. Enhanced Education System


    The incorporation of Generative AI in education, while met with resistance, is a profound indication of the shifting educational landscape. As Newey illustrates, successful integration of AI in education can significantly enhance learning experiences and the development of essential skills, securing our students' readiness for a future shaped by digital transformation.

    #179 Leveraging Generative AI in College


    In this episode, Darren interviews his daughter who recently completed her first semester in college about her experience using generative AI technology in her academic studies. She describes the challenges and successes associated with utilizing this transformational tool.

     Navigating the Intricacies of Academic Integration with Generative AI 


    In the fast-paced world defined by the rapid digital transformation, it is increasingly noticeable how AI constructs are becoming inextricable parts of everyday life. One captivating area where their impact can be felt is in the field of academics. This blog post intends to delve into the potential of generative AI with firsthand experiences from a student, Madeline Pulsipher, at BYU Idaho. 


    Applying generative AI assistance such as ChatGPT in academic work reveals exciting possibilities. When utilized responsibly, this powerful tool can provide a digital advantage in brainstorming ideas, generating essay outlines, and self-assessing your work against grading rubrics.


     Generative AI - Tool or Trick?


    Whether using AI for academic tasks constitutes cheating is an intriguing question. Madeline points out that using AI to facilitate a process or guide one's work should not be equated with cheating. Cheating would mean having the AI compose an essay entirely and taking credit for work one did not do.


    However, we must establish clear guidelines as we adopt newer technological methods. Defining what constitutes responsible use versus cheating when incorporating AI in academics is an essential task that educational institutions must take on and formalize.


     The Efficiency of AI in Self-assessment


    One intriguing use of AI stands out: Madeline graded her own work against the established marking rubric before submission. Her experiments with this approach bore fruit, earning her As on all of her AI-assisted essays. This signifies the newfound potential of AI to assist not just in mechanical tasks but also in the qualitative improvement of work.
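    A rough sketch of that workflow is below. The rubric text and file name are invented for illustration, and `ask_llm` is a placeholder for whatever chat model a student has access to, since the episode does not prescribe a specific tool.

```python
def ask_llm(prompt: str) -> str:
    """Placeholder: connect this to a chat model such as ChatGPT."""
    raise NotImplementedError("wire this to a real chat-model client")

def rubric_check(essay: str, rubric: str) -> str:
    prompt = (
        "Grade the essay below against each rubric item. "
        "Say whether each item is met and suggest one concrete improvement.\n\n"
        f"Rubric:\n{rubric}\n\nEssay:\n{essay}"
    )
    return ask_llm(prompt)

rubric = "1. Clear thesis.  2. Evidence for every claim.  3. Properly cited sources."
feedback = rubric_check(open("draft_essay.txt").read(), rubric)
```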


     Prospects and Ongoing Debates


    The use of AI in academic contexts has been debated for quite some time. While it can be a valuable tool for enhancing learning outcomes and improving productivity, it's important to remember that AI cannot replace the human intellect. Every new technology has benefits and drawbacks, and AI is no different.


    Although generative AI can produce content, it lacks the human touch that is essential in communication. It cannot replace human teachers in explaining complex concepts, as it lacks the ability to understand the nuances of human conversation. Therefore, while AI can be a valuable asset in certain areas, it cannot replace the value of human interaction and expertise.


     Improving Social Interactions


    The COVID-19 pandemic has disrupted the lives of many students beginning their freshman year in college this year. The negative dating trend among teenagers has been further exacerbated during the pandemic. Due to the lack of social interactions, the current generation misses many critical experiences, such as breaking up, first kissing, or asking for another date.


    Madeline sought advice from her friends on how to let down a guy who wanted another date but received conflicting advice. Then, she turned to ChatGPT, an impartial and unemotional AI-powered assistant, for advice. She used ChatGPT's suggestions as a guide to develop her approach.


    This ability to use Generative AI as an advisor rather than a definitive authority will be crucial for the next generation to leverage the power of AI in academic and social situations.


     The Future of AI in Academics


    Various concerns continue to hover around integrating AI into academics - worries about cheating, the lack of established institutional policies, and the possibility of fostering a short-cut culture. However, it is undeniable that generative AI is a tool many students are resorting to, and its full potential within academia still needs to be thoroughly explored.


    Clearly, the line between cheating and appropriate use needs to be carefully drawn. But once this line has been established, the success of AI as a tool in academic paradigms looks promising. If wielded correctly, it can become a substantial part of an educational toolkit, shaping competent individuals well-equipped to handle AI in their professional habitats.

    #178 Zero Trust networking with OpenZiti


    On this episode, Darren interviews Phillip Griffith, a community leader of the open-source project OpenZiti. They discuss the importance of Zero Trust networking in modern IT networks.

     Unveiling the Dynamics of Zero Trust Networking and Overlay Networks


    As the digital age progresses, the conversation around network security takes a frontline position. In a rapidly evolving digital landscape, Zero-trust networking and Overlay networks are critical strategies for tackling current security challenges. Here, we delve into these concepts, how they shape our digital systems and provide an understanding of their potential benefits and applications. 


     A Closer Look at Zero Trust Networking 


    Zero-trust networking is a mindset that places security as a prime concern in designing and operating digital systems. Its critical aspect is the presumption of potential threats from every part of the network, irrespective of how secure they may appear. This approach moves away from the traditional fortress-style concept in security and leads to more robust networks that do not rely solely on a single firewall's protection. 


    Firstly, the beauty of zero-trust networks lies in their capacity to work effectively and securely, presenting an advantage for software developers and engineers. Security becomes an enabler rather than a hindrance to the software development process. With zero-trust networking, developers can focus on feature development without worrying about blocked ports or consulting network teams—a significant step towards faster market releases. 


    Nevertheless, zero-trust networking doesn’t eliminate the need for perimeter defenses or firewalls. The zero trust strategy assumes a possible network compromise; therefore, it calls for defense layering instead of solely relying on elementary perimeter defense. 


     The Rise of Overlay Networks 


    Amid the rising security threats and data breaches, overlay networks are emerging as an invaluable tool. These software-defined virtual networks provide an extra layer of security compared to underlay networks such as routers or firewalls. 


    Overlay networks such as VPNs and WireGuard allow secure communication between resources even when the underlying network has been compromised. They offer attractive features, like self-reorganizing based on conditions, which gives them ephemeral characteristics. These networks also come with options for secure in-application or data-system communication; additionally, a clientless endpoint option bolsters user connectivity, requiring no software installation on individual devices. 


    Overlay networks provide flexibility concerning deployment. There’s no need to rewrite your application code, as the code for the overlay network can be embedded directly into the application code. Alternatively, a virtual appliance can be deployed instead if you want to avoid altering your application. This convenience, combined with added security, sets overlay networks up as future-proof solutions to network security. 
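    In rough terms, the embedded option looks like the sketch below. The `overlay_sdk` module, identity file, and service name are hypothetical placeholders rather than the OpenZiti API; the point is simply that the application dials a logical service on the overlay instead of opening a raw IP socket or inbound firewall port.

```python
import overlay_sdk  # hypothetical SDK standing in for a real overlay-network library

# Load the application's enrolled identity (assumed to exist from prior enrollment).
ctx = overlay_sdk.load_identity("app-identity.json")

# Dial a service by name on the overlay; no IP address or open inbound port is involved.
with ctx.dial("inventory-service") as conn:
    conn.send(b"GET /stock/42")
    reply = conn.recv()
```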


     The Power of ZTN and OpenZiti Solutions 


    Zero Trust networking (ZTN) offerings, like OpenZiti, provide capable solutions for zero trust and overlay networking, delivering robust Zero Trust principles into the field of overlay network solutions. 


    ZTN, for instance, brings its own identity system to the table, perfect for edge IoT devices unable to access typical identity services. It offers secure data transmission through mutual tunneling and an intelligent routing fabric that determines the most efficient path from point A to point B. OpenZiti, meanwhile, facilitates multiple use cases, managing east-west and north-south connections smoothly and securely. It integrates well with service meshes to provide high-level security. 


    Thus, adopting such holistic security measures becomes necessary as we step into the digital era. ZTN and OpenZiti present practical solutions for those embracing the Zero Trust model, with advantageous features ranging from identity management to secure connectivity. No doubt, these innovations are setting the benchmarks for network security.

    #177 Zero Trust Data with SafeLiShare


    During this episode, Darren and SafeLiShare CEO Shamim Naqvi discuss how confidential computing can be employed to create managed data-sharing collaborative environments in the cloud.

     The SafeLiShare Revolution in Data Sharing and Confidentiality 


    Data sharing has always been a key issue when dealing with sensitive and confidential business information. Advanced technological solutions, including SafeLiShare, have been tackling this problem, offering a controlled system for data access without violating data protection. The fundamental basis of this system is "Zero Trust", a strategy that doesn't assume trust for anyone and keeps control and monitoring at its core. 


     Harnessing the Power of Secure Enclaves


    A critical aspect of SafeLiShare's approach is the use of secure enclaves, or trusted execution environments, ensuring a safe space for data sharing, authentication, and management. These enclaves are created with the help of specific confidential computing chipsets that fully enclose the shared data. With encryption applied outside of these enclaves, data can only be decrypted once it enters the enclave, thereby providing an end-to-end encryption policy. The output exiting the enclave is also encrypted, adding another layer of security to protect the data.
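    The end-to-end pattern described above can be sketched generically with symmetric encryption: data is encrypted before it is shared, decrypted only inside the trusted boundary, and re-encrypted before the result leaves. The sketch uses the `cryptography` package's Fernet recipe purely for illustration; it is not SafeLiShare's implementation, and real enclaves provision keys through hardware attestation rather than in application code.

```python
from cryptography.fernet import Fernet

# Assumption for this sketch: the key is provisioned only to the enclave (via attestation).
enclave_key = Fernet.generate_key()
channel = Fernet(enclave_key)

# Outside the enclave: the data owner encrypts before sharing.
ciphertext = channel.encrypt(b"proprietary assay results")

def enclave_compute(blob: bytes) -> bytes:
    # Inside the enclave: decrypt, run the collaborative analytics, re-encrypt the output.
    plaintext = channel.decrypt(blob)
    result = plaintext.upper()          # stand-in for the real computation
    return channel.encrypt(result)

protected_result = enclave_compute(ciphertext)   # only ciphertext ever leaves the enclave
```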


    But challenges exist within this process. Not all online services incorporate a secure enclave in their operation, leading to a high demand for a more flexible, effective solution to confidential computing.


     The Hybrid Approach of Confidential Computing


    To address this issue, SafeLiShare offers an approach that is best described as a hybrid model of confidential computing. To compensate for services that don't operate within secure enclaves, this methodology introduces the idea of 'witnessed execution.' In this scenario, the user places trust in the provider's guarantee of its competency and safe data handling. It's a kind of tacit agreement between the user and the remote service provider, making confidential computing more feasible in real-world scenarios.


    This hybrid approach redefines the secure sharing paradigm in a world that's continuously evolving. With its elastic foundation, SafeLiShare incorporates a profound understanding of changing security parameters, making confidential computing adaptable and responsive to changing demands and realities.


     Conclusion: Revolutionizing Secure Data Sharing


    In essence, SafeLiShare is a forerunner in the journey to making sensitive data sharing secure, efficient, and feasible. Navigating around traditional hurdles, it integrates hybrid confidential computing into its framework, achieving a unique blend of trust and practicality. The innovative approach of integrating witnessed computing into the process blurs the lines between full and partial trust, making data security more achievable and delivering a promising narrative for the future of data sharing and security.

    #176 Zero Trust Shared Data


    In this episode, Darren interviews Shamim Naqvi, the CEO and founder of SafeLiShare, about managing and securing data in shared and collaborative environments using the zero-trust data model.

     Shamim Naqvi: Pioneering Data Privacy in the Age of Zero Trust Security


    In the ever-evolving world of computer science, addressing the issue of data privacy forms a daunting yet essential task. As digital transformations engulf every sphere of life, an increasing onus lies on preserving and protecting the user's data. One expert battling this computational challenge head-on is Shamim Naqvi, a veteran technologist and the driving force behind the innovative startup SafeLiShare.


     Prioritizing User Control in Data Privacy


    In a universe swarming with security measures focused mainly on encrypting network data or safeguarding ports, Naqvi's approach stands out because he prioritizes how data is used during computation. For him, it is less about erecting impregnable walls and more about enabling users to dictate how their data is used.


    Naqvi's trailblazing approach seeks to solve a previously unsolved conundrum: stopping unauthorized usage of user data. This issue is often a surreptitious byproduct of the trade between users and service providers—exchange of data for services. Over time, however, this data tends to stray into territories not intended by the users, triggering severe privacy concerns.


     Zero-Knowledge Proofs: A Gamechanger for Data Privacy


    In his quest for achieving data privacy, Naqvi gives special attention to a mathematical concept, zero-knowledge proofs, which allows a claim about data to be verified without revealing anything beyond the claim itself. Despite offering an elegant solution, the multifaceted mathematics behind zero-knowledge proofs poses a significant challenge to their efficient implementation in real-world applications.
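    A toy example conveys the flavor: in the Schnorr identification protocol, a prover convinces a verifier that it knows the secret exponent x behind a public value y = g^x mod p without ever revealing x. The tiny parameters below are chosen for readability only, not security, and this sketch is a textbook illustration rather than anything specific to SafeLiShare.

```python
import secrets

p, q, g = 467, 233, 4          # p = 2q + 1; g generates the subgroup of order q

x = secrets.randbelow(q - 1) + 1   # prover's secret
y = pow(g, x, p)                   # public value

r = secrets.randbelow(q)           # prover commits
t = pow(g, r, p)

c = secrets.randbelow(q)           # verifier issues a random challenge

s = (r + c * x) % q                # prover responds

# Verifier checks g^s == t * y^c (mod p); the transcript reveals nothing about x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```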


     Data Security in Naqvi's Startup Project: SafeLiShare


    Naqvi's cutting-edge firm, SafeLiShare, is making giant strides in striking a balance between user convenience and data privacy. Its motto, "share but not lose control," is a testament to its mission to foster a secure computing environment that leaves no data unprotected.


     Valuing Data Privacy in A Zero Trust Security Age


    In this modern era, where trust and secrecy are paramount, the idea of users controlling their own data is widely welcomed. Making data privacy more accessible is a thrilling challenge, and at the helm of SafeLiShare, Shamim Naqvi is breaking new ground with his innovative approaches to securing that privacy.

    December 3, 2023


    Please check out the latest news in the world of Digital Transformation for the week of December 3, 2023. You'll find a variety of interesting stories related to edge computing, data management, and artificial intelligence. This week, AWS and Siemens have collaborated to simplify edge computing, while Intel is helping to improve cloud-based data management. Additionally, several governments are developing new strategies for AI. Blog: https://embracingdigital.org/briefs/edw-44/en/episode.html Video: 

    3 Dicembre 2023


    Please check out the latest news in the world of Digital Transformation for the week of December 3, 2023. You'll find a variety of interesting stories related to edge computing, data management, and artificial intelligence. This week, AWS and Siemens have collaborated to simplify edge computing, while Intel is helping to improve cloud-based data management. Additionally, several governments are developing new strategies for AI. Blog: https://embracingdigital.org/briefs/edw-44/it/episode.html Video: 

    #175 Zero Trust with Operational Technology


    In this episode, Darren interviews Louis Parks, the CEO and founder of Veridify. They discuss the unique problems of operational technology (OT) networks that control critical infrastructure: legacy complexity, physical access vulnerabilities, and lack of visibility.

     Introduction


    Operational technology (OT) networks power our critical infrastructure like energy, transportation, and manufacturing systems. These OT networks were designed for safety and reliability without much thought about cybersecurity. However, with increased connectivity, OT networks face growing threats that could have major impacts on our physical world. This article discusses some of the unique challenges and solutions for securing OT environments.


     Legacy Complexity


    OT networks accumulate technologies over decades of operations, leading to complex environments with older unsupported devices and proprietary protocols. Trying to retrofit security is difficult without impacting critical functions. Solutions focus on non-intrusive monitoring of network traffic and encrypting data streams while maintaining existing systems. The priority is keeping systems running safely rather than taking systems offline to investigate threats.


    In addition, OT networks often have a mix of legacy devices using older proprietary protocols that predate common IT technologies like TCP/IP networking. Securing these heterogeneous environments requires protecting both modern IP-connected devices as well as older technology using obscure protocols. Emerging solutions aim to encrypt network traffic at the packet level, creating encrypted tunnels even over non-IP networks to block tampering.
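    Conceptually, wrapping a legacy protocol frame in an authenticated-encryption envelope before it crosses the wire looks like the sketch below. It uses AES-GCM from the `cryptography` package; the frame bytes, device identifier, and key handling are invented for illustration and do not describe any particular vendor's product.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # in practice, provisioned per device pair
aead = AESGCM(key)

def protect(frame: bytes, device_id: bytes) -> bytes:
    nonce = os.urandom(12)
    # The device ID is bound as associated data, so a frame replayed to another device fails.
    return nonce + aead.encrypt(nonce, frame, device_id)

def unprotect(blob: bytes, device_id: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aead.decrypt(nonce, ciphertext, device_id)   # raises if tampered with

legacy_frame = bytes.fromhex("110203a1b2")              # stand-in for a legacy, non-IP payload
wire = protect(legacy_frame, b"plc-07")
assert unprotect(wire, b"plc-07") == legacy_frame
```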


     Physical Access Vulnerabilities


    Many OT devices are distributed in publicly accessible areas like smart city infrastructure or manufacturing plants. This makes them vulnerable to physical tampering by malicious actors trying to access networks. Solutions aim to encrypt network traffic from end to end, blocking man-in-the-middle attacks even if someone gains physical access to infrastructure.


    Demonstrating these physical access threats, solutions show how devices secretly plugged into infrastructure switches are unable to control other devices or decrypt meaningful data from the network when encryption is enabled. This foils common attacks by insiders with physical access trying to spy on or disrupt operations.


     Lack of Visibility


    OT networks often lack visibility into assets, vulnerabilities, and threats compared to IT environments. Simply gaining an accurate asset inventory and monitoring network activity can improve security postures. Emerging solutions apply IT security best practices like zero trust segmentation to OT environments through centralized policy management rather than trying to secure each individual asset.


    In addition to lack of visibility, OT networks transmit data without protections common in IT environments like encryption. Unencrypted plain text protocols allow anyone with network access to spy on sensitive operational data. New solutions not only selectively encrypt sensitive data streams but also establish secure tunnels between authorized devices rather than openly transmitting data.


     Conclusion


    Securing OT environments raises unique challenges but solutions are emerging to balance improved cybersecurity with operational reliability. Non-intrusive monitoring, data encryption, and centralized policy enforcement allow incremental hardening of OT networks against escalating threats. There is still a long way to go but progress is being made.