Software intelligence

From Wikipedia, the free encyclopedia

Software Intelligence is insight into complex software structure, produced by software that analyzes database structure, software frameworks, and source code in order to better understand and control complex software systems in Information Technology environments.[1] Similar to Business Intelligence (BI), Software Intelligence is produced by a set of software tools and techniques for mining data and the inner structure of software. The end result is information that business and software stakeholders use to make informed decisions, communicate about software health, measure the efficiency of software development organizations, and prevent software catastrophes.[2]


Because of the complexity and wide range of components involved in software, Software Intelligence comprises an increasing number of components, including:

  • Graphical visualization of the inner structure of the software product or application considered[3]
  • Code analyzer to serve as an information basis for other components
  • Dependency representation, from data acquisition (automated and real-time data capture, end-user entries) up to data storage
  • Navigation capabilities within components and impact analysis features
  • Grades or scores of structural and software quality aligned with industry standards such as OMG, CISQ, or SEI
  • List of architectural and coding violations against standardized best practices[4]
  • Reporting of structural alerts
  • Industry references and benchmarking, allowing comparisons between analysis outputs and industry standards
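Two of the components above, the code analyzer and the dependency representation, can be sketched in miniature. The following Python sketch is purely illustrative (it is not any vendor's implementation): it extracts module-level dependencies from source code and flags a simple structural violation. The `long-function` rule and its 50-line threshold are assumptions chosen for the example.

```python
import ast

# Illustrative limit for the hypothetical "long-function" rule.
LONG_FUNCTION_THRESHOLD = 50  # lines

def analyze(source: str, module: str):
    """Return (dependencies, violations) for one module's source code."""
    tree = ast.parse(source)
    deps = set()        # modules this module depends on (graph edges)
    violations = []     # (rule, location) pairs
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            deps.update(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            deps.add(node.module)
        elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            length = node.end_lineno - node.lineno + 1
            if length > LONG_FUNCTION_THRESHOLD:
                violations.append(
                    ("long-function", f"{module}:{node.name} ({length} lines)"))
    return deps, violations

deps, violations = analyze("import os\nfrom json import loads\n", "example")
print(sorted(deps))   # edges for a dependency graph: ['json', 'os']
print(violations)     # no rule violated in this snippet: []
```

Running such an analyzer over every module of a system yields the raw material for the other components: the dependency sets form the edges of a graph that can be visualized and navigated, and the violation lists feed scoring and reporting.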


The term Software Intelligence was used as early as 1979 by Kirk Paul Lafler, an American engineer, entrepreneur, and consultant, who founded Software Intelligence Corporation that year. At the time, the term mainly related to SAS activities, in which he has been an expert since 1979.

In 1984, Victor R. Basili detailed, in the IEEE Transactions on Software Engineering, a methodology for collecting valid software engineering data, covering the evaluation of software development and deviations from initial goals.[5]

In 1996, CAST Software, a public company based in France, launched the first Software Intelligence platform to address systems built from different technology stacks.

In 2010, Ahmed E. Hassan and Tao Xie defined Software Intelligence as a practice "offering software practitioners (not just developers) up-to-date and pertinent information to support their daily decision-making processes", one that "should support decision-making processes throughout the lifetime of a software system". They predicted that Software Intelligence would have a "strong impact on modern software practice" in the coming decades.[6]

User Aspect

Several considerations must be addressed in order to successfully integrate a Software Intelligence system into a company. Ultimately, the system must be accepted and used by its users for it to add value to the organization. If the system does not add value to the users' mission, they simply will not use it, as M.-A. Storey stated in 2003.[7]

At the code and system-representation level, Software Intelligence systems must provide different levels of abstraction: an abstract view for designing, explaining, and documenting, and a detailed view for understanding and analyzing the software system.[8]

At the governance level, user acceptance of Software Intelligence covers several areas related to the inner functioning of the system as well as its output. It encompasses these requirements:

  • Comprehensive: missing information may lead to wrong or inappropriate decisions, and completeness is a factor influencing user acceptance of a system.[9]
  • Accurate: accuracy depends on how the data is collected, so as to ensure fair and indisputable opinions and judgements.[10]
  • Precise: precision is usually judged by comparing several measurements from the same or different sources.[11]
  • Scalable: lack of scalability in the software industry is a critical factor leading to failure.[12]
  • Credible: outputs must be trusted and believed.
  • Deployable and usable
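The "precise" criterion above can be illustrated with a small sketch: precision is judged by comparing several measurements of the same metric, from the same or different sources. The quality scores and the 1% agreement bound below are hypothetical values chosen for the example, not an industry threshold.

```python
from statistics import mean, pstdev

# Hypothetical quality scores for one codebase, reported by three
# analysis runs (or three different tools) measuring the same metric.
measurements = [72.1, 71.8, 72.4]

spread = pstdev(measurements)                 # absolute dispersion
relative_spread = spread / mean(measurements) # dispersion relative to the mean

# A small relative spread suggests the measurements agree, i.e. the
# metric is precise; the 1% bound is an arbitrary illustration.
print(f"relative spread: {relative_spread:.4f}")
assert relative_spread < 0.01
```

Note that precision in this sense is independent of accuracy: three runs can agree closely with each other while all being biased by the same flawed data-collection process.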


Software Intelligence has many applications in all businesses that depend on software, whether software for professionals, software for individuals, or embedded software.

  • Providing analytics about the software itself or the stakeholders involved in its development, e.g., productivity measurement to inform business and IT leaders about progress toward business goals.[13]
  • Assessment and benchmarking to help business and IT leaders make informed, fact-based decisions about software.[14]
  • Measuring against industry standards to diagnose structural flaws in an IT environment.[15]
  • Compliance validation with respect to security, specific regulations, or technical matters.
  • Uniform documentation of all inner components of the software.[16]


Software Intelligence is a high-level discipline that has gradually grown to cover the applications listed above. Several markets drive the need for it:

  • Application Portfolio Analysis (APA), aiming at improving enterprise performance[17][18]
  • Software assessment, for producing software KPIs[19] and improving quality and productivity
  • Software security and resiliency measures and validation
  • Software evolution or legacy modernization, which requires blueprints of the software systems as well as tools for improving and facilitating modifications

There are numerous Software Intelligence vendors:

  • CAST Software claims to be the pioneer, category leader, and main vendor, and is referenced by multiple analysts and advisory firms across all sub-markets.[20]
  • Dynatrace offers a Software Intelligence solution for hybrid and multi-cloud environments, mainly for the US market.[21]
  • IBM – IBM Application Discovery and Delivery Intelligence[22] is an analytical platform for COBOL systems hosted on IBM Z systems. It uses cognitive technologies to analyze mainframe applications and understand interdependencies and the impact of change. It analyzes application logic, assesses the impact of changes, supports root-cause analysis, and provides visibility into project health, and it includes testing features, cognitive user guidance, and risk anticipation.
  • MIA Software provides software system blueprints and software quality features, and operates mainly in the French market[23]
  • Magnify: a tool for software visualization based on static code analysis, developed at the University of Warsaw in Poland[24]
  • Micro Focus – Relativity[25]: for COBOL applications, lets developers create mappings between files, records, and fields and database tables, views, and columns using graphical design tools. Relativity presents COBOL data as a relational data source over ODBC or JDBC connectivity, making the data held in traditional COBOL data files available to virtually any analytics or reporting tool, such as Microsoft Excel, and enabling business users and developers to use reporting tools to enhance the data-processing capabilities of COBOL applications.
  • Tableau focuses on data processing via Software Intelligence, with offerings addressing healthcare, education, and government.


  1. ^ Dąbrowski R. (2012) On Architecture Warehouses and Software Intelligence. In: Kim T., Lee Y., Fang W. (eds) Future Generation Information Technology. FGIT 2012. Lecture Notes in Computer Science, vol 7709. Springer, Berlin, Heidelberg
  2. ^ Ahmed E. Hassan and Tao Xie. 2010. Software intelligence: the future of mining software engineering data. In Proceedings of the FSE/SDP workshop on Future of software engineering research (FoSER '10). ACM, New York, NY, USA, 161–166
  3. ^ Renato Novais, José Amancio Santos, Manoel Mendonça, Experimentally assessing the combination of multiple visualization strategies for software evolution analysis, Journal of Systems and Software, Volume 128, 2017, pp. 56–71, ISSN 0164-1212, doi:10.1016/j.jss.2017.03.006.
  4. ^ Software Engineering Rules on code quality.
  5. ^ Victor R. Basili and David M. Weiss. 1984. A Methodology for Collecting Valid Software Engineering Data. IEEE Trans. Softw. Eng. 10, 6 (November 1984), 728–738. doi:10.1109/TSE.1984.5010301
  6. ^ Ahmed E. Hassan and Tao Xie. 2010. Software intelligence: the future of mining software engineering data. In Proceedings of the FSE/SDP workshop on Future of software engineering research (FoSER '10). ACM, New York, NY, USA, 161–166. doi:10.1145/1882362.1882397
  7. ^ Storey MA. (2003) Designing a Software Exploration Tool Using a Cognitive Framework. In: Zhang K. (eds) Software Visualization. The Springer International Series in Engineering and Computer Science, vol 734. Springer, Boston, MA.
  8. ^ Seonah Lee, Sungwon Kang, What situational information would help developers when using a graphical code recommender?, Journal of Systems and Software, Volume 117, 2016, pp. 199–217, ISSN 0164-1212, doi:10.1016/j.jss.2016.02.050.
  9. ^ Linda G. Wallace, Steven D. Sheetz, The adoption of software measures: A technology acceptance model (TAM) perspective, Information & Management, Volume 51, Issue 2, 2014, pp. 249–259, ISSN 0378-7206, doi:10.1016/
  10. ^ Lippert, S.K., & Forman, H. (2005). Utilization of information technology: examining cognitive and experiential factors of post-adoption behavior. IEEE Transactions on Engineering Management, 52, 363–381.
  11. ^ Rajiv D. Banker and Chris F. Kemerer (1992). Performance Evaluation Metrics for Information Systems Development: A Principal-Agent Model. Information Systems Research, volume 3, number 4, 379–400.
  12. ^ M. Crowne, "Why software product startups fail and what to do about it. Evolution of software product development in startup companies," IEEE International Engineering Management Conference, 2002, pp. 338–343 vol.1. doi:10.1109/IEMC.2002.1038454
  13. ^ LaValle S, Lesser E, Shockley R, Hopkins MS and Kruschwitz N (2011) Big data, analytics and the path from insights to value. MIT Sloan Management Review 52 (2), 21–32.
  14. ^ Janez Prašnikar, Žiga Debeljak, Aleš Ahčan (2005) Benchmarking as a tool of strategic management, Total Quality Management & Business Excellence, volume 16, number 2, 257–275, doi:10.1080/14783360500054400
  15. ^
  16. ^ Parnas, David Lorge (2011), Precise Documentation: The Key to Better Software, The Future of Software Engineering, 125–148, doi:10.1007/978-3-642-15187-3_8
  17. ^
  18. ^
  19. ^
  20. ^
  21. ^
  22. ^
  23. ^
  24. ^ G. Timoszuk, R. Dabrowski, K. Stencel, C. Bartoszuk, “Magnify – A new tool for software visualization”, Federated Conference on Computer Science and Information Systems, pp. 1485–1488, 2013
  25. ^