RAISE NEWS

   2019

   2018

   2017

   2016

  • Paper accepted at EMSE

    11/14/2016

    Dr. Menzies’ paper titled “Are Delayed Issues Harder to Resolve? Revisiting Cost-to-Fix of Defects throughout the Lifecycle” has been accepted for publication at EMSE. Author version can also be found here.

  • Dr. Menzies is a keynote speaker at SWAN-2016

    11/13/2016

    Dr. Menzies will be the keynote speaker at the 2nd International Workshop on Software Analytics (SWAN 2016). His talk is titled “More or Less: seeking simpler software analytics”. The slides of the talk can be found here.

  • Dr. Menzies to serve as co-chair at SSBSE-2017

    10/11/2016

    Dr. Menzies will serve as co-program chair for the Symposium on Search-Based Software Engineering 2017 (SSBSE ’17). See the flyer for SSBSE ’17.

  • Among the top-3 papers at SSBSE-2016

    10/10/2016

    Vivek, Dr. Menzies, and Jianfeng’s paper titled “An (Accidental) Exploration of Alternatives to Evolutionary Algorithms for SBSE” at the Symposium on Search-Based Software Engineering (SSBSE) was adjudged to be among the Top-3 (of 48 submissions). Slides can be viewed here. <img align=left src="/img/timm_ssbse_16.jpg" height=265 width=400>

  • Dr. Menzies presents at SSBSE-2016

    10/09/2016

    Dr. Menzies presented his paper titled “An (Accidental) Exploration of Alternatives to Evolutionary Algorithms for SBSE” at the Symposium on Search-Based Software Engineering (SSBSE). Slides can be viewed here. <img align=left src="/img/timm_ssbse_16.jpg" height=265 width=400>

  • Paper accepted at ESE

    09/29/2016

    George and Dr. Menzies’ paper titled “Negative Results for Software Effort Estimation” has been accepted for publication at ESE.

  • Dr. Menzies is a Guest Speaker

    09/20/2016

    Dr. Menzies has been invited to speak at the “Big Software on the Run” winter school in the Netherlands on October 27, 2016.

  • Rahul Krishna submits his paper to IST

    09/12/2016

    Rahul submitted his paper titled ‘Recommendations for Intelligent Code Reorganization’ to the Journal of Information and Software Technology.

  • Wei Fu submits his paper to IST

    09/08/2016

    Wei Fu submitted his paper titled ‘Why is Differential Evolution Better than Grid Search for Tuning Defect Predictors?’ to the Journal of Information and Software Technology.

  • Rahul Krishna presents at ASE-2016

    09/05/2016

    Rahul Krishna presented his paper titled “Too much automation? the bellwether effect and its implications for transfer learning” at the International Conference on Automated Software Engineering (ASE 2016). Slides can be viewed here. <img align=left src="/img/rahul_ase_2016.jpg" height=265 width=400>

  • Three Papers submitted to ICSE'17

    08/28/2016

    The last day for submissions to the International Conference on Software Engineering 2017 was Aug 28, 2016. This year we have three very interesting papers, submitted by Amrit, George and Dr. Menzies: “Trends in Topics at SE Conferences (1993-2013)”, “What is Wrong with Topic Modeling? (and How to Fix it Using Search-based SE)”, and “A deep learning model for estimating story points”. Now we wait with our fingers crossed!

  • Jianfeng Chen submits his paper to TSE

    08/27/2016

    Jianfeng Chen submitted his paper titled ‘Is “Sampling” better than “Evolution” for Search-based Software Engineering?’ to the IEEE Transactions on Software Engineering.

  • Reading Party for ICSE'17

    08/27/2016

    Reading party to critique the work of Amrit, George, and others. Great papers, good food, and lots of caffeine. <img align=left src="/img/icse_reading_party.jpg" height=270 width=480>

  • Foundation of Software Science

    08/18/2016

    Dr. Menzies is teaching a new course, “Foundation of Software Science”. This subject will explore methods for designing data collection experiments; collecting that data; exploring that data; then presenting that data in such a way as to support business-level decision making for software projects.

  • Funding from LexisNexis

    08/15/2016

    Thanks to LexisNexis for sponsoring our BigSE project with a grant (total award: $60K).

  • Funding from NSA

    08/10/2016

    Thanks to the NSA for sponsoring our privatized data sharing research (Privatized data sharing: Practical? Useful?) with a grant (total award: $85K).

  • Rahul Krishna's paper accepted to ASE

    07/18/2016

    Rahul Krishna’s paper titled “Too Much Automation? The Bellwether Effect and Its Implications for Transfer Learning” is accepted to the 31st IEEE/ACM International Conference on Automated Software Engineering (ASE 2016). This was joint work with Dr. Lucas Layman of the Fraunhofer Center for Experimental Software Engineering. Here is a link to his paper.

  • REU Camp

    06/29/2016

    RAISE hosted two undergraduate students (Abdulrahim Sheikhnureldin and Matthew J. Martin) over summer ’16, where they worked on projects titled ‘The Effect of Code Dependencies on Software Project Quality’ and ‘Enhanced Issue Prediction Using Contextual Features’, respectively.

  • Vivek Nair's paper accepted to SSBSE

    06/10/2016

    Vivek Nair’s paper titled “An (Accidental) Exploration of Alternatives to Evolutionary Algorithms for SBSE” is accepted to the Symposium on Search-Based Software Engineering - 2016.

  • Congrats to RAISE Members

    06/01/2016

    Congrats to 5 members of RAISE for securing internships at LexisNexis and ABB.

  • Rahul Krishna's paper accepted to BIG DSE

    05/05/2016

    Rahul Krishna’s paper titled “The “BigSE” Project: Lessons Learned from Validating Industrial Text Mining” is accepted to the BIG Data Software Engineering Workshop, 2016. This was joint work with Manuel Dominguez and David Wolf of LexisNexis, Raleigh. Here is a link to his paper.

  • Wei Fu's paper accepted to IST journal

    04/29/2016

    Wei Fu’s paper titled “Tuning for software analytics: Is it really necessary?” is accepted to the Journal of Information and Software Technology. This was joint work with Dr. Xipeng Shen. Here is a link to his paper.

  • The BigSE Project

    02/01/2016

    Mr. Krishna submits his paper titled “The “BigSE” Project: Lessons Learned from Validating Industrial Text Mining” to BIGDSE. This is joint work with Mr. Yu, Mr. Agarwal, Dr. Menzies, Mr. Manuel Dominguez and Mr. David Wolf.

    Title: The BigSE Project: Lessons Learned from Validating Industrial Text Mining

    Abstract: As businesses become increasingly reliant on big data analytics, it becomes increasingly important to test the choices made within the data miners. This paper reports lessons learned from the BigSE Lab, an industrial/university collaboration that augments industrial activity with low-cost testing of data miners (by graduate students). BigSE is an experiment in academic/industrial collaboration. Funded by a gift from LexisNexis, BigSE has no specific deliverables. Rather, it is fueled by a...

   2015

  • Dr. Menzies’ talk at the CREST Open Workshop

    11/24/2015

    Dr. Menzies is one of the speakers at The 44th CREST Open Workshop - Predictive Modelling for Software Engineering. The talk is titled “Predicting What Follows Predictive Modeling”. Slides can be viewed here.

  • Relax! Most stats yield the same results

    10/20/2015

    Dr. Menzies submits his paper titled “On the Value of Negative Results in Software Analytics” to Empirical Software Engineering. This is joint work with Dr. Ekrem Kocaguneli.

    Title: On the Value of Negative Results in Software Analytics

    Abstract: When certifying some new technique in software analytics, some ranking procedure is applied to check if the new model is in fact any better than the old. These procedures include t-tests and other more recently adopted operators such as Scott-Knott. We offer here the negative result that at least one supposedly “better” ranking procedure, recently published in IEEE Transactions on Software Engineering, is in fact functionally equivalent (i.e. gives the same result) to some much simpler and older procedures. This negative...

  • Older methods just as good or better than anything else

    10/20/2015

    Dr. Menzies submits his paper titled “Negative Results for Software Effort Estimation” to Empirical Software Engineering. This is joint work with Dr. Ye Yang, Mr. George Mathew, Dr. Barry Boehm and Dr. Jairus Hihn.

    Title: Negative Results for Software Effort Estimation

    Abstract: Context: More than half the literature on software effort estimation (SEE) focuses on comparisons of new estimation methods. Surprisingly, there are no studies comparing state-of-the-art methods with decades-old approaches. Objective: To check if new SEE methods generated better estimates than older methods. Method: Firstly, collect effort estimation methods ranging from “classical” COCOMO (parametric estimation over a pre-determined set of attributes) to “modern” (reasoning via analogy using spectral-based clustering plus instance and feature selection)....

  • Dr. Menzies’ talk at the University of Notre Dame

    10/06/2015

    Dr. Menzies is to talk to the computer science students at the University of Notre Dame. The talk is titled “The Future and Promise of Software Engineering Research”. See the posting for the talk. Slides can be viewed here.

  • Dr. Menzies delivers a talk at HPCC summit 2015

    09/29/2015

    Dr. Menzies delivers a talk titled “Big Data: the weakest link” at the HPCC summit 2015. He was also part of a panel discussion on “Grooming Data Scientists for Today and for Tomorrow”. Congratulations to Dr. Menzies for winning an award for his outstanding contribution to the HPCC community. <img align=left src="/img/DrM_hpcc_talk.png"> <img align=left src="/img/DrM_hpcc_panel.png"> Dr. Menzies says “I want a scientist. I want someone who actually doubts their own conclusions vigorously.”

  • Welcome Dr. Dam

    09/27/2015

    We are very happy to host fellow researcher Dr. Dam from down under (Australia). Dr. Hoa Khanh Dam is a Senior Lecturer at the School of Computing and Information Technology, University of Wollongong, Australia. The lab is excited to learn from his experience with requirements engineering and effort estimation in agile settings. <img align=left src="/img/Dr.Hoa_Dam.png">

  • Mr. Rahul Krishna submits his paper to ICSE'16

    08/28/2015

    Mr. Rahul Krishna submits his paper titled “How to Learn Useful Changes to Software Projects (to Reduce Runtimes and Software Defects)” to ICSE 2016. This is joint work with Dr. Xipeng Shen, Andrian Marcus, Naveen Lekkalapudi and Lucas Layman. For more see notes.

    Title: How to Learn Useful Changes to Software Projects (to Reduce Runtimes and Software Defects)

    Abstract: Business users now demand more insightful analytics; specifically, tools that generate “plans” – specific suggestions on what to change in order to improve the predicted values. This paper proposes XTREE, a planner for software projects. XTREE receives tables of data with independent features and a corresponding weighted class which indicates the quality (“bad” or “better”) of each row in the table....

  • Mr. Wei Fu submits his paper to ICSE'16

    08/28/2015

    Wei Fu submits his paper titled “Tuning for Software Analytics: is it Really Necessary?” to ICSE 2016. This is joint work with Dr. Xipeng Shen. For more see notes.

  • Dr. Tim Menzies submits his paper to ICSE'16

    08/28/2015

    Dr. Menzies submits his paper titled “Live and Let Die? (Delayed Issues not Harder to Resolve)” to ICSE 2016. This is joint work with Dr. William R. Nichols, Dr. Forrest Shull and Dr. Lucas Layman.

    Title: Live and Let Die? (Delayed Issues not Harder to Resolve)

    Abstract: Many practitioners and academics believe in a delayed issue effect (DIE); i.e. as issues linger longer in a system, they become exponentially harder to resolve. This belief is often used to justify major investments in new development processes that promise to retire more issues, sooner. This paper tests for the delayed issue effect in 171 software projects conducted around the world in the period from 2006–2014. To the best of our knowledge, this...

  • Laws of trusted data sharing

    08/10/2015

    A repeated, and somewhat pessimistic, conclusion is that the more we privatize data, the more we lose the signal in that data. That is, the safer the data (for sharing), the worse it becomes (for making conclusions). Recent results have addressed this issue. Former RAISE member (now working on her post-doc) Fayola Peters presented her novel privacy algorithm called LACE2. In recent work with Dr. Tim Menzies, presented at the International Conference on Software Engineering, Dr. Peters applies instance-based learning methods to learn how much (and how little) we can mutate data without changing the conclusions we might learn from that data. Based on that work, she offers three laws of trusted data mining. To explain our three laws, we must...

  • LexisNexis to fund AI lab

    06/09/2015

    For more, see briefing notes

  • HPC Cluster Access

    02/26/2015

    We now have HPC accounts, which give us access to 1000+ 8-core machines! For more, see the tutorial.