References: The Quantum Insider, www.sciencedaily.com
Recent advancements in quantum computing research have yielded promising results. Researchers at the University of the Witwatersrand in Johannesburg, along with collaborators from Huzhou University in China, have discovered a method to shield quantum information from environmental disruptions, potentially leading to more reliable quantum technologies. This breakthrough involves manipulating quantum wave functions to preserve quantum information, which could enhance medical imaging, improve AI diagnostics, and strengthen data security by providing ultra-secure communication.
UK startup Phasecraft has announced a new algorithm, THRIFT, that improves the ability of quantum computers to model new materials and chemicals by a factor of 10. By optimizing the quantum simulation itself, THRIFT lets scientists model new materials and chemicals faster and more accurately, even on today's slower machines. Furthermore, Oxford researchers have demonstrated a 25-nanosecond controlled-Z gate with 99.8% fidelity, combining high speed and accuracy in a simplified superconducting circuit. This advances fault-tolerant quantum computing by improving raw gate performance rather than relying on error correction or added hardware. A sketch of the baseline simulation technique follows below.
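For context, here is a minimal sketch of the technique that time-dynamics algorithms of this kind refine: first-order Trotterization, the standard way a quantum computer approximates a material's time evolution. (Treating THRIFT as a Trotterization improvement is an assumption on my part; the announcement above gives no internals.) The snippet evolves a toy two-qubit transverse-field Ising Hamiltonian and measures the Trotter error against exact evolution:

```python
# First-order Trotterization of H = ZZ + g(X⊗I + I⊗X) on two qubits.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

g = 0.7
HZZ = np.kron(Z, Z)                          # interaction term
HX = g * (np.kron(X, I2) + np.kron(I2, X))   # transverse field
H = HZZ + HX

t, n_steps = 1.0, 20
dt = t / n_steps
U_exact = expm(-1j * H * t)
U_step = expm(-1j * HZZ * dt) @ expm(-1j * HX * dt)  # one Trotter step
U_trotter = np.linalg.matrix_power(U_step, n_steps)

err = np.linalg.norm(U_trotter - U_exact, 2)
print(f"Trotter error with {n_steps} steps: {err:.2e}")  # shrinks as O(dt)
```

Halving dt halves the error, and that cost-accuracy tradeoff is exactly what better simulation algorithms push in the quantum computer's favor.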
References: Mike Watts@computational-intelligence.blogspot.com
Recent developments highlight advancements in quantum computing, artificial intelligence, and cryptography. Classiq Technologies, in collaboration with Sumitomo Corporation and Mizuho-DL Financial Technology, achieved up to 95% compression of quantum circuits for Monte Carlo simulations used in financial risk analysis. The project used Classiq's technology to generate more efficient quantum circuits for a novel quantum Monte Carlo simulation algorithm, proposed by Mizuho-DL FT, that incorporates pseudo-random numbers, evaluating the feasibility of implementing quantum algorithms in financial applications.
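To make concrete what those circuits estimate, here is a purely classical Monte Carlo sketch of a standard risk quantity, value-at-risk (the Gaussian return model and its parameters are my illustrative assumptions, not from the study); quantum Monte Carlo methods target the same estimate with quadratically fewer samples via amplitude estimation:

```python
# Classical Monte Carlo value-at-risk: the baseline that quantum
# amplitude-estimation-based Monte Carlo aims to accelerate.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000
# Toy model (assumption): daily portfolio return is Gaussian.
returns = 0.0002 + 0.01 * rng.standard_normal(n_samples)
var_99 = -np.quantile(returns, 0.01)  # loss exceeded only 1% of the time
print(f"99% one-day VaR: {var_99:.4%}")
```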
Oxford researchers demonstrated a fast, 99.8% fidelity two-qubit gate using a simplified circuit design, achieved with a modified coaxmon circuit architecture. Also, a collaborative team from JPMorgan Chase, Quantinuum, Argonne National Laboratory, Oak Ridge National Laboratory, and the University of Texas at Austin demonstrated a certified randomness protocol using a 56-qubit Quantinuum System Model H2 trapped-ion quantum computer. This is a major milestone for real-world quantum applications: the certified randomness was validated using over 1.1 exaflops of classical computing power, confirming the quantum system's ability to generate entropy beyond classical reach. The 2025 IEEE International Conference on Quantum Artificial Intelligence will be held in Naples, Italy, from November 2-5, 2025, with a paper submission deadline of May 15, 2025. Vanderbilt University will host a series of workshops devoted to Groups in Geometry, Analysis and Logic starting May 28, 2025.
References: Terence Tao@What's new, beuke.org
Terence Tao has recently uploaded a paper to the arXiv titled "Decomposing a factorial into large factors." The paper studies a quantity, denoted t(N), defined as the largest value such that N! can be factorized into t(N) factors, with each factor at least N. This concept, initially introduced by Erdős, captures how equitably a factorial can be split into its constituent factors.
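As a concrete illustration of the definition (my own brute-force sketch, not from the paper, and exponential-time, so only viable for tiny N):

```python
import math
from functools import lru_cache

def t(N: int) -> int:
    """Largest t such that N! is a product of t integers, each >= N."""
    IMPOSSIBLE = -10**9

    @lru_cache(maxsize=None)
    def best(m: int, lo: int) -> int:
        # Max number of factors >= lo (nondecreasing) whose product is m.
        if m == 1:
            return 0
        result = 1 if m >= lo else IMPOSSIBLE  # take m as one last factor
        d = lo
        while d * d <= m:                      # or split off a divisor d
            if m % d == 0:
                result = max(result, 1 + best(m // d, d))
            d += 1
        return result

    return best(math.factorial(N), N)

print([t(N) for N in range(2, 9)])  # [1, 1, 2, 2, 3, 4, 4]
```

For example, t(7) = 4 because 7! = 5040 = 7 · 8 · 9 · 10, with every factor at least 7.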
Erdős initially conjectured that an upper bound on t(N) was asymptotically sharp, implying that factorials could be split into factors of nearly uniform size for large N. However, a purported proof by Erdős, Selfridge, and Straus was lost, leaving the assertion a conjecture. The paper establishes bounds on t(N), recovering the previously lost result. Further conjectures by Guy and Selfridge explore whether related relationships hold for all values of N. On March 30th, mathematical enthusiasts celebrated facts related to the number 89. Eighty-nine is a Fibonacci prime, and a striking pattern emerges in its reciprocal: the decimal expansion 1/89 = 0.011235... reproduces the Fibonacci sequence, since 1/89 is the sum of F(n)/10^(n+1) over all n. The number 89 is also reported to be obtainable as a summation of the first 5 integers raised to the first 5 Fibonacci numbers, and it satisfies 8^1 + 9^2 = 89, a property reminiscent of Armstrong numbers, which are numbers equal to the sum of their digits each raised to the number of digits.
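A quick exact-arithmetic check of the reciprocal identity (my own verification, not from the cited posts):

```python
# Verify 1/89 = sum over n >= 1 of F(n) / 10**(n+1).
from fractions import Fraction

a, b = 0, 1                      # F(0), F(1)
partial = Fraction(0)
for n in range(1, 40):           # partial sum through F(39)
    partial += Fraction(b, 10 ** (n + 1))
    a, b = b, a + b
print(float(partial), 1 / 89)    # both: 0.011235955056179775
print(8**1 + 9**2)               # 89: digits raised to consecutive powers
```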
References: Tom Bridges@blogs.surrey.ac.uk, The Quantum Insider
Recent breakthroughs are pushing the boundaries of quantum theory and quantum randomness, paving the way for commercial applications and more reliable quantum technologies. A paper by Dorje Brody, with collaborators Eva-Maria Graefe and Rishindra Melanathuru, has been published in Physical Review Letters, addressing the question of how decoherence arises when the environment monitors a system's position and momentum simultaneously, i.e., performs a phase-space measurement.
Researchers have also made strides in protecting quantum information from environmental disruptions, offering hope for more stable quantum computers and networks. Scientists have demonstrated how certain quantum states can maintain their critical information even when disturbed by environmental noise. This could lead to more reliable quantum technology, enhanced medical imaging techniques, improved AI-driven diagnostics, and stronger data security. Simultaneously, a joint research team with members from JPMorgan Chase, Quantinuum, multiple national labs, and UT Austin has achieved certified quantum randomness, turning once-theoretical experiments into the first commercial applications of quantum computing. The team demonstrated a certified randomness protocol using Quantinuum's 56-qubit H2 trapped-ion system, showcasing a quantum computer's ability to generate entropy beyond classical reach. Furthermore, the high cost of quantum randomness is dropping thanks to advances in pseudorandomness techniques, which may open new doors for quantum computing and cryptography research.
References: Webb Wright@Quanta Magazine, The Quantum Insider
Researchers are making significant strides in reducing the costs associated with quantum randomness, a crucial element for cryptography and simulations. Traditionally, obtaining true quantum randomness has been complex and expensive. However, the exploration of "pseudorandomness" offers a practical alternative, allowing researchers to utilize computational algorithms that mimic randomness, thus sidestepping the high costs of pure quantum randomness. This development broadens the accessibility of randomness, enabling researchers to pursue new scientific investigations.
The team from JPMorgan Chase, Quantinuum, multiple national labs, and UT Austin demonstrated the first successful certified randomness protocol on a quantum computer. Using a 56-qubit quantum machine, they output more randomness than they initially put in, a feat considered impossible for even the most powerful classical supercomputers, since certifying the output required checking it against classical simulation at enormous scale. This groundbreaking achievement could open new doors for quantum computing and cryptography research.
References: Ellie Ramirez-Camara@Data Phoenix
The ARC Prize Foundation has launched ARC-AGI-2, a new AI benchmark designed to challenge current foundation models and track progress towards artificial general intelligence (AGI). Building on the original ARC benchmark, ARC-AGI-2 blocks brute force techniques and introduces new tasks intended for next-generation AI systems. The goal is to evaluate real progress toward AGI by requiring models to reason abstractly, generalize from few examples, and apply knowledge in new contexts, tasks that are simple for humans but difficult for machines.
The Foundation has also announced the ARC Prize 2025, a competition running from March 26 to November 3, with a grand prize of $700,000 for a solution achieving an 85% score on the ARC-AGI-2 benchmark's private evaluation dataset. Early testing shows that even OpenAI's top models suffer a significant performance drop: o3 falls from about 75% on the original benchmark to approximately 4% on ARC-AGI-2. This highlights how the new benchmark raises the bar for AI evaluation, measuring general fluid intelligence rather than memorized skills.
References: Matt Marshall@AI News | VentureBeat, Microsoft Security Blog, www.zdnet.com
Microsoft is enhancing its Copilot Studio platform with AI-driven improvements, introducing deep reasoning capabilities that let agents tackle intricate problems through methodical thinking, combining AI flexibility with deterministic business process automation. The company has also unveiled specialized deep reasoning agents for Microsoft 365 Copilot, named Researcher and Analyst, to help users complete tasks more efficiently. These agents are designed to function like personal data scientists, processing diverse data sources and generating insights through code execution and visualization.
Microsoft's focus includes both securing AI and using AI to bolster security, as demonstrated by the upcoming Microsoft Security Copilot agents and new security features. Microsoft aims to provide an AI-first, end-to-end security platform that helps organizations secure their future; one example is the set of AI agents designed to autonomously assist with phishing, data security, and identity management. The Security Copilot tool will automate routine tasks, allowing IT and security staff to focus on more complex issues and aiding in the defense against cyberattacks.
References: Amir Najmi@unofficialgoogledatascience.com
Data scientists and statisticians are continuously exploring methods to refine data analysis and modeling. A recent blog post from Google details a project focused on quantifying the statistical skills necessary for data scientists within their organization, aiming to clarify job descriptions and address ambiguities in assessing practical data science abilities. The authors, David Mease and Amir Najmi, leveraged their extensive experience conducting over 600 interviews at Google to identify crucial statistical expertise required for the "Data Scientist - Research" role.
Statistical testing remains a cornerstone of data analysis, guiding analysts in transforming raw numbers into actionable insights. One must also keep the bias-variance tradeoff in mind and choose the right statistical test to ensure the validity of an analysis, as the sketch below illustrates. These tools are critical both for traditional statistical roles and for the evolving field of AI/ML, where responsible practice is paramount, a point highlighted in discussions of the relevance of statistical controversies to ethical AI/ML development at an AI ethics conference on March 8.
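A minimal sketch of why test choice matters (the data and parameters are made up for illustration, not from the Google post): the same two samples can give different p-values under Student's t-test, which assumes equal variances, and Welch's t-test, which does not:

```python
# Student's vs Welch's t-test on unequal-variance, unequal-size samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a = rng.normal(loc=0.0, scale=1.0, size=30)    # group A: small, tight
b = rng.normal(loc=0.8, scale=3.0, size=200)   # group B: large, noisy

t_s, p_s = stats.ttest_ind(a, b, equal_var=True)   # Student's
t_w, p_w = stats.ttest_ind(a, b, equal_var=False)  # Welch's
print(f"Student's t-test: p = {p_s:.4f}")
print(f"Welch's t-test:   p = {p_w:.4f}")
```

When group variances and sizes differ, Welch's variant is generally the safer default; picking the wrong one is exactly the kind of validity threat described above.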
References: Stephen Ornes@Quanta Magazine, medium.com
A novel quantum algorithm has demonstrated a speedup over classical computers for a significant class of optimization problems, according to a recent report. This breakthrough could represent a major advancement in harnessing the potential of quantum computers, which have long promised faster solutions to complex computational challenges. The new algorithm, known as decoded quantum interferometry (DQI), outperforms all known classical algorithms in finding good solutions to a wide range of optimization problems, which involve searching for the best possible solution from a vast number of choices.
Classical researchers have so far been unable to match this quantum advance. Reports of new quantum algorithms often spark excitement, partly because they can offer new perspectives on difficult problems. The DQI algorithm is considered a "breakthrough in quantum algorithms" by Gil Kalai, a mathematician at Reichman University. While quantum computers have generated considerable buzz, it has been challenging to identify specific problems where they significantly outperform classical machines. This new algorithm demonstrates the potential for quantum computers to excel at optimization tasks, a development that could have broad implications across many fields.
References: Maximilian Schreiner@THE DECODER
Google's Gemini 2.5 Pro is making waves as a top-tier reasoning model, marking a leap forward in Google's AI capabilities. Released recently, it's already garnering attention from enterprise technical decision-makers, especially those who have traditionally relied on OpenAI or Claude for production-grade reasoning. Early experiments, benchmark data, and developer reactions suggest Gemini 2.5 Pro is worth serious consideration.
Gemini 2.5 Pro distinguishes itself with transparent, structured reasoning. Google's step-by-step training approach yields a structured chain of thought: the model presents ideas in numbered steps, with sub-bullets and internal logic that is remarkably coherent. This offers greater trust and steerability, letting enterprise users validate, correct, or redirect the model with more confidence when evaluating output for critical tasks.
References: Tom Bridges@blogs.surrey.ac.uk
Recent activity in the mathematical community has highlighted the enduring fascination with mathematical constants and visual representations of mathematical concepts. A blog post on March 23, 2025, discussed a remarkably accurate approximation for pi: π ≈ 3 ln(640320) / √163 is exact within the limits of floating-point arithmetic, accurate to 15 decimal places. This discovery builds on historical efforts to approximate pi, from ancient Babylonian and Egyptian calculations to Archimedes' method of exhaustion and the achievements of Chinese mathematicians like Liu Hui and Zu Chongzhi. The approximation is checked numerically below.
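A quick numerical check (my own, using the mpmath library) confirms the agreement; the formula works because e^(π√163) is famously close to 640320³ + 744:

```python
# Check pi ≈ 3 ln(640320) / sqrt(163) beyond double precision.
from mpmath import mp

mp.dps = 30                    # 30 decimal digits of working precision
approx = 3 * mp.log(640320) / mp.sqrt(163)
print(approx)                  # 3.14159265358979323846...
print(mp.pi)                   # 3.14159265358979323846...
print(abs(approx - mp.pi))     # ≈ 2e-16, invisible in 64-bit floats
```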
Visual insights in mathematics continue to be explored. A blog called Visual Insight shares striking images that help explain topics in mathematics, and its creator gave a talk about it at the Illustrating Math Seminar. The blog features images created by people such as Refurio Anachro, Greg Egan, and Roice Nelson, and individual articles are available on the AMS website.
References: Denise Gaskins@denisegaskins.com
Recent studies and educational resources are focusing on enhancing math education through innovative approaches. Denise Gaskins' "Let's Play Math" blog offers resources for families to learn and enjoy math together, including playful math books and internet resources suitable for various age groups. Math journaling and games have been highlighted as effective tools to engage students, promote problem-solving skills, and foster a richer mathematical mindset.
Numerous games and activities can make learning fun. For instance, "Make a Square" is a game that builds 2-D visualization skills and strategic thinking, and quick number games can be played anywhere. Divisibility rules, particularly the rule for divisibility by 2, are being emphasized to help students easily identify even and odd numbers. A megastudy also revealed that behaviorally informed email messages improved students' math progress, demonstrating how simple interventions can positively impact learning outcomes.
References: Yvonne Smit@QuSoft
Koen Groenland's book, "Introduction to Quantum Computing for Business," is gaining attention as a key resource for guiding companies on leveraging quantum advancements. As the Dutch quantum ecosystem expands, experts like Groenland are playing a vital role in making quantum knowledge accessible to the business world. The book aims to demystify this technology for business professionals without a technical background, focusing on the capabilities and applications of quantum computers rather than the underlying technical details. Groenland hopes the book will become a standard work for anyone starting a quantum journey, emphasizing the importance of understanding quantum algorithms for business value.
Classiq Technologies, in collaboration with Sumitomo Corporation and Mizuho-DL Financial Technology, achieved significant compression of quantum circuits for Monte Carlo simulations used in financial risk analysis. The study compared traditional and pseudo-random-number-based quantum Monte Carlo methods, optimizing circuit depth and qubit usage with Classiq's high-level quantum design platform, Qmod. The results showed that efficient circuit compression is possible without compromising accuracy, supporting the feasibility of scalable, noise-tolerant quantum applications in financial risk management.

Separately, the Open Source Initiative (OSI) and Apereo Foundation have jointly responded to the White House Office of Science & Technology Policy's (OSTP) request for information on an AI Action Plan. Their comment emphasizes the benefits of Open Source and positions the Open Source community as a valuable resource for policymakers. The OSI highlighted its history of stewarding the Open Source Definition and its recent work co-developing the Open Source AI Definition (OSAID), recommending that the White House rely on the OSAID as a foundational piece of any future AI Action Plan.
References: The Cryptography Caffè
The UK's National Cyber Security Centre (NCSC) has released a roadmap for transitioning to post-quantum cryptography (PQC), establishing key dates for organizations to assess risks, define strategies, and fully transition by 2035. This initiative aims to mitigate the future threat of quantum computers, which could potentially break today's widely used encryption methods. The NCSC’s guidance recognizes that PQC migration is a complex and lengthy process requiring significant planning and investment.
By 2028, organizations are expected to complete a discovery phase, identifying systems and services reliant on cryptography that need upgrades, and to draft a migration plan. High-priority migration activities should be completed by 2031, with infrastructure prepared for a full transition. The NCSC emphasizes that these steps are essential for addressing quantum threats and improving overall cyber resilience. Ali El Kaafarani, CEO of PQShield, noted that the timelines give organizations clear instructions for protecting the UK's digital future.

Separately, researchers have introduced ZKPyTorch, a compiler that integrates ML frameworks with zero-knowledge-proof engines to simplify the development of zero-knowledge machine learning (ZKML). ZKPyTorch automates the translation of ML operations into optimized ZKP circuits and improves proof-generation efficiency. In case studies, ZKPyTorch successfully converted VGG-16 and Llama-3 models into ZKP-compatible circuits.
References: Editor-In-Chief, BitDegree@bitdegree.org
A new, fully AI-driven weather prediction system called Aardvark Weather is making waves in the field. Developed through an international collaboration including researchers from the University of Cambridge, Alan Turing Institute, Microsoft Research, and the European Centre for Medium-Range Weather Forecasts (ECMWF), Aardvark Weather uses a deep learning architecture to process observational data and generate high-resolution forecasts. The model is designed to ingest data directly from observational sources, such as weather stations and satellites.
This innovative system stands out because it can run on a single desktop computer, generating forecasts tens of times faster than traditional systems while requiring thousands of times less computing power. While traditional forecasting relies on Numerical Weather Prediction (NWP) models that use physics-based equations and vast computational resources, Aardvark Weather replaces every stage of that pipeline with a single, streamlined machine learning model. According to the researchers, Aardvark Weather can generate a forecast in seconds or minutes, using only about 10% of the weather data required by current forecasting systems.
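To illustrate the end-to-end, data-driven idea in miniature (a toy of my own; Aardvark's real architecture, inputs, and grid are not specified here), a model can map a set of raw station observations directly to a gridded forecast with no physics stage at all:

```python
# Toy observations-to-grid forecaster in the spirit of end-to-end ML
# weather prediction. Architecture and sizes are illustrative assumptions.
import torch
import torch.nn as nn

class TinyObsToGrid(nn.Module):
    def __init__(self, n_obs_features=4, grid_h=32, grid_w=64, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(            # per-station encoder
            nn.Linear(n_obs_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden),
        )
        self.decoder = nn.Sequential(            # pooled summary -> grid
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, grid_h * grid_w),
        )
        self.grid_shape = (grid_h, grid_w)

    def forward(self, obs):                      # obs: (batch, stations, features)
        pooled = self.encoder(obs).mean(dim=1)   # permutation-invariant pooling
        return self.decoder(pooled).view(-1, *self.grid_shape)

model = TinyObsToGrid()
obs = torch.randn(2, 100, 4)                     # 2 samples, 100 stations
print(model(obs).shape)                          # torch.Size([2, 32, 64])
```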
References: staff@insidehpc.com
Nvidia CEO Jensen Huang has publicly walked back previous comments made in January, where he expressed skepticism regarding the timeline for quantum computers becoming practically useful. Huang apologized for his earlier statements, which caused a drop in stock prices for quantum computing companies. During the recent Nvidia GTC 2025 conference in San Jose, Huang admitted his misjudgment and highlighted ongoing advancements in the field, attributing his initial doubts to his background in traditional computer systems development. He expressed surprise that his comments had such a significant impact on the market, joking about the public listing of quantum computing firms.
SEEQC and Nvidia announced a significant breakthrough at the conference, demonstrating a fully digital quantum-classical interface protocol between a Quantum Processing Unit (QPU) and a Graphics Processing Unit (GPU). The interface is designed to provide ultra-low latency and bandwidth-efficient quantum error correction. Nvidia is also expanding its support for quantum research with the CUDA-Q platform, designed to streamline the development of hybrid, accelerated quantum supercomputers; CUDA-Q performance can now be pushed further than ever with v0.10 support for the NVIDIA GB200 NVL72.
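For a flavor of the platform, here is a minimal CUDA-Q kernel in Python that prepares and samples a Bell state (a stock hello-world of my own, not from the announcement, assuming the current cudaq Python API):

```python
# Minimal CUDA-Q example: prepare a Bell pair and sample it.
import cudaq

@cudaq.kernel
def bell():
    qubits = cudaq.qvector(2)         # allocate two qubits
    h(qubits[0])                      # put qubit 0 in superposition
    x.ctrl(qubits[0], qubits[1])      # CNOT entangles the pair
    mz(qubits)                        # measure both

counts = cudaq.sample(bell, shots_count=1000)
print(counts)                         # expect roughly half '00', half '11'
```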
References: Charlie Wood@Quanta Magazine
Recent data from the Dark Energy Spectroscopic Instrument (DESI) suggests that dark energy, the mysterious force driving the accelerating expansion of the universe, may be weakening over time. This challenges the standard model of cosmology, which assumes dark energy has a constant density and pressure. Researchers, including Seshadri Nadathur from the DESI collaboration, have analyzed significantly more data than in previous studies, strengthening the conclusion that the engine driving cosmic expansion might be sputtering.
The findings are also supported by evidence from the Dark Energy Survey (DES), which likewise observed a vast expanse of the cosmos and reported indications of varying dark energy. Miguel Zumalacárregui notes that Euclid's capabilities could better determine the universe's expansion rate through gravitational-wave observations. If confirmed, this would rewrite our understanding of the universe's fate, potentially leading to alternative scenarios beyond the current model of endless expansion and eventual cosmic emptiness.
References: Pat'sBlog
Recent discussions have highlighted the diverse applications and historical roots of mathematics. A blog post explored the history of mathematical terms such as billion, trillion, and others, tracing their origins back to figures like Nicholas Chuquet, a French physician from the 15th century. The evolution of these terms and their varying definitions across different countries demonstrate the rich history and changing conventions within mathematical nomenclature. This information has recently resurfaced in a post from earlier this year.
Alongside the history of math, practical applications are being discussed. For example, word problems focusing on division, suitable for fourth-grade students, are now available; step-by-step solutions for problems involving dividing quantities among groups can help students improve their comprehension of division and problem solving. Mathematics also continues to underpin the algorithms behind many modern technological applications, even though it is not always classified as a science.
References: Matt Swayne@The Quantum Insider
D-Wave Quantum Inc. has made a splash by claiming its Advantage2 annealing quantum computer achieved quantum supremacy in complex materials simulations, publishing its study in the journal Science. The company states that its system can perform simulations in minutes that would take the Frontier supercomputer nearly a million years and consume more than the world's annual electricity consumption. According to D-Wave CEO Alan Baratz, this achievement validates quantum annealing's practical advantage and represents a major milestone in quantum computational supremacy and materials discovery.
However, D-Wave's claim has faced criticism, with researchers suggesting that classical algorithms can rival or even exceed the quantum methods in these simulations; some say they performed similar calculations on a normal laptop in just two hours. Concerns have been raised about the real-world applicability and practical benefits of D-Wave's supremacy claims. Despite the criticism, D-Wave stands by the study's findings. The sketch below shows the classical baseline at its simplest.
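For intuition about the kind of computation being debated (a toy of my own construction, far simpler than either D-Wave's hardware runs or the critics' methods), here is classical simulated annealing on a random Ising spin glass, the classical cousin of the problems quantum annealers are built for:

```python
# Classical simulated annealing on a random Ising spin glass.
import numpy as np

rng = np.random.default_rng(0)
n = 64
J = rng.normal(size=(n, n))
J = (J + J.T) / 2                  # symmetric couplings
np.fill_diagonal(J, 0)
s = rng.choice([-1, 1], size=n)    # random initial spins

def energy(s):
    return -0.5 * s @ J @ s

T = 2.0
for step in range(20000):
    i = rng.integers(n)
    dE = 2 * s[i] * (J[i] @ s)     # energy change from flipping spin i
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        s[i] = -s[i]               # accept the flip (Metropolis rule)
    T *= 0.9997                    # cool down gradually
print("final energy:", energy(s))
```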