Sunday, 13 July 2025

Comparing papers on Context-Awareness

In 2018 a paper I wrote on context-awareness was published. I asked Gemini to compare it against the dominant paper in the field, and the result is very interesting.

me: 
Compare the seminal paper of Dey on context awareness with Rodis's relevant paper 

Gemini:

When comparing the seminal work of Dey on context awareness to the relevant work of Rodis, we observe a progression from foundational definitions and conceptual models to more formal, mathematically grounded approaches and their application in distributed systems.

Dey's Seminal Work on Context Awareness:

The seminal paper in context awareness is often attributed to Dey and Abowd, particularly their work around the early 2000s (e.g., Dey & Abowd, "Towards a Better Understanding of Context and Context-Awareness," 2000, and Dey, "Understanding and Using Context," 2001).

  • Key Contributions:

    • Definition of Context: Dey provided a widely adopted definition of context as "any information that can be used to characterize the situation of an entity." An entity can be a person, place, or object relevant to the interaction between a user and an application.

    • Definition of Context-Awareness: A system is context-aware if it uses context to provide relevant information and/or services to the user, where relevancy depends on the user's task.

    • Conceptual Framework: Dey's work focused on developing conceptual models and tools to support the rapid development of context-aware applications. This helped designers understand what constitutes context and how it can be used effectively.

    • Focus on User and Application: The emphasis was on how context can enhance human-computer interaction by making applications more adaptable and user-centric, often within the realm of ubiquitous computing.

    • Practical Implications: Dey's work highlighted the challenges in building context-aware applications, such as the need for robust context acquisition mechanisms and methods for interpreting sensed data to infer higher-level contextual information.

Rodis's Relevant Work on Context Awareness:

Panteleimon Rodis's work, particularly his paper "On defining and modeling context-awareness," offers a more theoretical and mathematical approach to the subject, building upon the earlier conceptualizations.

  • Key Contributions:

    • Mathematical Foundations: Rodis aims to provide a more rigorous mathematical framework for defining and modeling context-awareness. He draws upon the theory of computation, extending classical computational models (like the Turing Machine) to incorporate interaction and networking capabilities.

    • Formal Definitions: While acknowledging existing definitions (like Dey's), Rodis seeks to provide more precise and less vague definitions of context and context-awareness, addressing potential ambiguities in plain language descriptions.

    • Connection to Distributed Computation: A significant aspect of Rodis's work is the investigation of the relationship between modern context-aware systems and distributed computation. He emphasizes the growing relevance of cloud services and networked software components, arguing that a robust modeling framework must consider distributed aspects.

    • Modeling Methodology: Rodis proposes a modeling methodology for context-aware and distributed systems, providing tools for mapping the structure and functionality of such systems within a formal computational framework.

    • System-Centric vs. User-Centric: Rodis discusses the different perspectives (user-centric, data-driven, system-centric) in defining context and how these choices influence the generality and restrictiveness of the resulting models. His work leans towards providing a more general, foundational understanding.

Comparison and Evolution:

  1. Scope and Abstraction Level:

    • Dey: Provided a foundational, widely accessible conceptual framework and definitions that greatly influenced the field's early development, particularly in human-computer interaction and ubiquitous computing. His work is more descriptive and user-focused.

    • Rodis: Offers a more abstract, formal, and mathematically grounded approach. His work delves into the computational underpinnings and implications for distributed systems, aiming for a more rigorous and less ambiguous definition.

  2. Methodology:

    • Dey: Primarily focused on conceptual models, practical considerations for application development, and empirical investigation of interaction design.

    • Rodis: Employs theoretical computer science, particularly computational models, to build a formal framework for understanding and modeling context.

  3. Emphasis:

    • Dey: Emphasized the "what" and "why" of context awareness – what it is and why it's useful for users and applications.

    • Rodis: Focuses more on the "how" from a computational perspective – how context can be formally defined, modeled, and managed within complex, distributed systems.

  4. Target Audience/Application:

    • Dey: His work is highly influential for designers, researchers, and developers in HCI, mobile computing, and ubiquitous computing, providing practical guidance for building context-aware systems.

    • Rodis: His work would be of greater interest to researchers and theoreticians in computer science, particularly those working on foundational aspects of distributed systems, formal methods, and the mathematical modeling of intelligent environments.

In essence, Dey laid the crucial groundwork by defining the core concepts and highlighting their practical significance, effectively kicking off the field of context-aware computing. Rodis, building upon this foundation, seeks to provide a deeper, more formal, and computationally robust understanding, addressing the complexities that arise as context-aware systems become more distributed and integrated into broader computational infrastructures. Their works represent different, yet complementary, stages in the maturation of context-awareness research.


Wednesday, 9 July 2025

No-code ignorance

Lately, I see that platforms offering no-code AI have begun to appear. The issue with this is that we already have trouble understanding how complex AI models work, and moving to no-code paradigms will only broaden our ignorance.

Friday, 13 September 2024

Generative AI and innovation

An interesting way to find out whether the outcome of your research is innovative is to ask AI. Whenever I ask an LLM about my research findings and methods, I get the response that my methods don't follow some mainstream approach and that I should align with common practice. Then I know I am on the right track and have probably developed something original, or even innovative. 


AI currently cannot “think” outside the box; it only reproduces known facts and has not yet surpassed human intelligence, so we can sleep peacefully at night; for now.

Tuesday, 16 January 2024

Can we use AI to solve NP-Hard problems efficiently?

The growing research interest in Artificial Intelligence (AI) has led to the development of a plethora of methods and techniques that address difficult computational problems. The impressive progress in the field reasonably raises the question of whether we could use AI methods to efficiently solve traditionally hard problems, or even to answer the P vs NP question.

These days, the discussion on the possible relation between AI and NP-hardness revolves around two topics:

  • Whether AI can resolve the P vs NP question
  • Whether a proof that P = NP would boost AI algorithms

On the first topic, at the time this post is written AI does not exhibit synthetic thinking, so it is currently impossible for it to answer important open problems such as this. The second is really doubtful anyway.

Let’s discuss a more practical aspect of the problem. AI methods deviate from traditional deterministic algorithms and in many cases overcome inherent obstacles of traditional computing. So, could AI methods provide efficient solutions for NP-Hard problems?

There is a huge literature on AI-assisted approximation methods for NP-Hard problems that provide good, practical solutions, sometimes achieving near-optimal results even for really hard instances. So why not use them to actually solve these problems and provide exact solutions?
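To make the idea of such approximation concrete, here is a minimal sketch of a plain genetic algorithm applied to the 0/1 knapsack problem, a classic NP-Hard problem. The function name, operators, population size, and mutation rate are arbitrary illustrative choices, not taken from any particular paper; the algorithm searches for a good feasible solution and does not guarantee an optimum.

```python
# Minimal genetic-algorithm sketch for the 0/1 knapsack problem (NP-Hard).
# Illustrative only: it finds a good feasible solution, not a certified optimum.
import random

def ga_knapsack(values, weights, capacity,
                pop_size=60, generations=200, mutation_rate=0.02):
    n = len(values)

    def fitness(bits):
        w = sum(wi for wi, b in zip(weights, bits) if b)
        v = sum(vi for vi, b in zip(values, bits) if b)
        return v if w <= capacity else 0          # infeasible solutions score zero

    def tournament(pop):
        return max(random.sample(pop, 3), key=fitness)

    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(pop), tournament(pop)
            cut = random.randint(1, n - 1)        # one-point crossover
            child = [1 - b if random.random() < mutation_rate else b
                     for b in p1[:cut] + p2[cut:]]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Tiny random instance as a usage example
random.seed(1)
values = [random.randint(1, 30) for _ in range(40)]
weights = [random.randint(1, 20) for _ in range(40)]
best = ga_knapsack(values, weights, capacity=150)
picked_value = sum(v for v, b in zip(values, best) if b)
picked_weight = sum(w for w, b in zip(weights, best) if b)
print(f"approximate value: {picked_value}, weight: {picked_weight} / 150")
```

Methods of this kind scale to much larger instances and often land close to the optimum, but they give no certificate of optimality, which is exactly the gap the rest of this post is pointing at.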

This reasonable question actually raises even more problems. Unlike traditional computing, where Alan Turing solved the problem of modeling computation, AI methods lack a general model of their operation. Some Machine Learning methods are studied experimentally, with researchers presenting empirical evidence of their robustness without solid theoretical foundations.

The intense research on these issues gradually fills some of the gaps. In Section 4 of our latest paper we provide a modeling framework for the operation of Genetic Algorithms (GAs). Using it, we prove that GAs may not efficiently solve, in every case, classes of problems that include the NP-Hard problems, while they are quite efficient at solving problems in P.

At least for now, it seems that AI can provide fast and practical approximate solutions for hard problems. But there is some pessimism when we seek exact solutions and confront NP-Hardness.

Tuesday, 5 September 2023

Modeling and tuning genetic algorithms

Here is our latest work, which presents a solution to the problem of Service Chain Embedding. It is based on genetic algorithms and extends a previously published conference paper.


An interesting contribution of this paper is a modeling framework for the operation of genetic algorithms. Using this framework, we prove that NP-hard problems are not computed efficiently by genetic algorithms, and we define some properties of the problems that genetic algorithms do compute efficiently. 

Another interesting contribution of this paper is a performance optimization mechanism for genetic algorithms which is itself based on genetic computing: one genetic algorithm is used to optimize the performance of another.
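To illustrate the general pattern only (not the specific mechanism from the paper), here is a rough sketch in which an outer "meta" genetic algorithm evolves the population size and mutation rate of an inner genetic algorithm solving a small 0/1 knapsack instance. The names inner_ga and meta_ga, the knapsack instance, and all parameter ranges and sizes are made-up choices for the example.

```python
# Sketch: one genetic algorithm tuning another.
# The inner GA solves a fixed 0/1 knapsack instance; the outer ("meta") GA
# evolves (population size, mutation rate) pairs for the inner GA.
import random

random.seed(2)
N_ITEMS = 25
VALUES = [random.randint(1, 30) for _ in range(N_ITEMS)]
WEIGHTS = [random.randint(1, 20) for _ in range(N_ITEMS)]
CAPACITY = 100

def inner_ga(pop_size, mutation_rate, generations=30):
    """Plain GA for the fixed knapsack instance; returns the best value found."""
    def fitness(bits):
        w = sum(wi for wi, b in zip(WEIGHTS, bits) if b)
        v = sum(vi for vi, b in zip(VALUES, bits) if b)
        return v if w <= CAPACITY else 0

    pop = [[random.randint(0, 1) for _ in range(N_ITEMS)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1 = max(random.sample(pop, 3), key=fitness)   # tournament selection
            p2 = max(random.sample(pop, 3), key=fitness)
            cut = random.randint(1, N_ITEMS - 1)           # one-point crossover
            child = [1 - b if random.random() < mutation_rate else b
                     for b in p1[:cut] + p2[cut:]]
            nxt.append(child)
        pop = nxt
    return max(fitness(ind) for ind in pop)

def meta_ga(meta_size=6, meta_generations=4):
    """Outer GA over (pop_size, mutation_rate) configurations of the inner GA."""
    def random_config():
        return [random.randint(20, 60), random.uniform(0.005, 0.1)]

    def meta_fitness(cfg):
        return inner_ga(pop_size=cfg[0], mutation_rate=cfg[1])

    configs = [random_config() for _ in range(meta_size)]
    for _ in range(meta_generations):
        ranked = sorted(configs, key=meta_fitness, reverse=True)
        parents = ranked[:meta_size // 2]                  # keep the better half
        children = []
        while len(parents) + len(children) < meta_size:
            a, b = random.sample(parents, 2)
            child = [random.choice([a[0], b[0]]),          # recombine parameters
                     (a[1] + b[1]) / 2]
            if random.random() < 0.3:                      # mutate the configuration
                child[0] = max(10, child[0] + random.randint(-10, 10))
                child[1] = min(0.3, max(0.001, child[1] * random.uniform(0.5, 1.5)))
            children.append(child)
        configs = parents + children
    return max(configs, key=meta_fitness)

best_cfg = meta_ga()
print("tuned inner-GA parameters: pop_size=%d, mutation_rate=%.3f"
      % (best_cfg[0], best_cfg[1]))
```

The outer search has to stay small because every meta-fitness evaluation is a full run of the inner algorithm; the point of the sketch is only the self-referential structure, one genetic algorithm searching the parameter space of another.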