Open Source Stack
From development to production, many folks are interested in diversifying the tech stack using open source tools. Questions were raised about using:
- PostgreSQL or MySQL instead of SQL Server for the ODS
- .NET Core/Standard instead of .NET Framework for the API and related utilities
- Visual Studio Code instead of Visual Studio for compilation
The Ed-Fi ODS/API developers mentioned that moving to .NET Core fits the 80/20 rule: most of the work would be straightforward (the 80%), but some time-consuming drudge work (the 20%) would be needed to take it the last mile. Similarly, the current code generation and the database access layers (Entity Framework and NHibernate) present challenges for non-SQL Server use. On the positive side, leveraging MetaEd may help with the "easy" 80% of a move to another database platform. Moving to Visual Studio Code seems like an easier lift; however, the code generation process currently requires Visual Studio and the Visual Studio SDK.
Notifications / Pub-Sub Model
Getting actionable data directly into teachers' hands can sometimes be a challenge. Is there any possibility of creating an Ed-Fi notification system? This question broadened out into a brief discussion of a few approaches for event handling:
- Publish / subscribe (pub-sub) architecture whereby the ODS/API might publish certain events that subscribers could process to create appropriate notifications.
- Webhooks that allow the API to call back to a vendor or district after processing a request.
- The newly named Change Queries feature (née Change Events), which allows systems to look for new and modified records.
The original question seemed to imply tracking and reacting to metrics, for example pushing an early-warning notification to a teacher when a student's attendance falls below a certain threshold. It was noted that this type of action might be best left to a business intelligence platform, whether home-grown or vendor-built. Webhooks can be appropriate for responding to API requests that require longer processing, and indeed they are included in the next-generation bulk API design. The sense was that this technology and topic have a lot of promise, but there is still substantial work to do to understand community use cases, needs, and priorities.
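The pub-sub idea above can be sketched in a few lines. This is a minimal in-process illustration, not Ed-Fi code: the topic name, payload fields, and threshold are all hypothetical, and a real deployment would publish through a message broker rather than an in-memory dictionary.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-process publish/subscribe bus (illustrative only)."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: dict) -> None:
        # Deliver the event to every handler registered for this topic.
        for handler in self._subscribers[topic]:
            handler(payload)

ATTENDANCE_THRESHOLD = 0.90  # hypothetical early-warning cutoff
notifications: list[str] = []

def early_warning(payload: dict) -> None:
    # Subscriber: react to an attendance update, queue a notification
    # when the student's rate drops below the threshold.
    if payload["attendanceRate"] < ATTENDANCE_THRESHOLD:
        notifications.append(
            f"Early warning: student {payload['studentUniqueId']} "
            f"attendance at {payload['attendanceRate']:.0%}"
        )

bus = EventBus()
bus.subscribe("attendance.updated", early_warning)

# A publisher (hypothetically, the ODS/API) emits events as records change.
bus.publish("attendance.updated", {"studentUniqueId": "604822", "attendanceRate": 0.87})
bus.publish("attendance.updated", {"studentUniqueId": "604823", "attendanceRate": 0.95})

print(notifications)  # one warning, for the student below 90%
```

The same handler shape maps naturally onto webhooks: instead of an in-memory callback, the publisher would POST the payload to a URL the subscriber registered.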
Analytics Middle Tier
The response to this new initiative, which is currently an Exchange contribution, was very encouraging. The Alliance will continue to evaluate directions, and it communicated at the event that it welcomes community contributions of additional generic dimensional views and use-case-specific views.
Stakeholders want to see the effort continue beyond a one-shot proof-of-concept, with expansion into other domains and use cases. There was a suggestion to create a dictionary of metric calculations that can be performed using the middle tier views, especially those that are universally applicable (or nearly so). Another request is to ensure the presence of a District Key in relevant views so that they can be used in multi-tenant (district) installations.
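To make the two requests above concrete, here is a sketch of one entry such a metric dictionary might contain, computed over rows shaped like a hypothetical middle-tier attendance view. The column names (DistrictKey, DaysEnrolled, DaysAbsent) and the 10% cutoff are illustrative assumptions, not the actual Analytics Middle Tier schema; the point is that filtering on a District Key is what lets the same view serve a multi-tenant installation.

```python
# Hypothetical rows from a middle-tier attendance view (illustrative schema).
rows = [
    {"DistrictKey": "255901", "StudentKey": "S1", "DaysEnrolled": 100, "DaysAbsent": 12},
    {"DistrictKey": "255901", "StudentKey": "S2", "DaysEnrolled": 100, "DaysAbsent": 3},
    {"DistrictKey": "255902", "StudentKey": "S3", "DaysEnrolled": 90,  "DaysAbsent": 10},
]

def chronic_absenteeism_rate(rows: list[dict], district_key: str, cutoff: float = 0.10) -> float:
    """Share of a district's students absent more than `cutoff` of enrolled days.

    The DistrictKey filter scopes the calculation to one tenant, so the
    same view can back every district in a multi-district installation.
    """
    students = [r for r in rows if r["DistrictKey"] == district_key]
    if not students:
        return 0.0
    chronic = [r for r in students if r["DaysAbsent"] / r["DaysEnrolled"] > cutoff]
    return len(chronic) / len(students)

print(chronic_absenteeism_rate(rows, "255901"))  # 0.5 (1 of 2 students over the cutoff)
```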
Improving the quality of data coming into the ODS is still on the minds of many. In 2017 the Technical Advisory Group reviewed the idea of introducing a “Tier 1” validation layer (Tier 1 refers to validating the data before it lands in an API) and recommended against it due to the complexity of requirements for vendor interoperability.
The topic of community collaboration on validation engines that provide Tier 2 validation (after the data have landed in the API) was also raised. New Mexico and Michigan each have home-grown validation engines, which might be shared with the community via the Ed-Fi Exchange.
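A Tier 2 engine of the kind described above typically runs declarative rules over records that have already landed. This sketch shows the shape of such a rule set; the rule IDs, messages, and record fields are hypothetical and do not come from the New Mexico or Michigan implementations.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """One Tier 2 validation rule: an ID, a human-readable message, and a check."""
    rule_id: str
    message: str
    check: Callable[[dict], bool]  # returns True when the record passes

# Hypothetical rule set; a shared engine would load rules like these from config.
RULES = [
    Rule("ATT-001", "Attendance rate must be between 0 and 1",
         lambda r: 0.0 <= r.get("attendanceRate", 0.0) <= 1.0),
    Rule("STU-002", "studentUniqueId is required",
         lambda r: bool(r.get("studentUniqueId"))),
]

def validate(record: dict) -> list[str]:
    """Return the IDs of every rule the record fails (empty list = clean)."""
    return [rule.rule_id for rule in RULES if not rule.check(record)]

good = {"studentUniqueId": "604822", "attendanceRate": 0.95}
bad = {"attendanceRate": 1.4}

print(validate(good))  # []
print(validate(bad))   # ['ATT-001', 'STU-002']
```

Because the rules run after ingestion, they can flag problems for follow-up without blocking vendor submissions, which is what distinguishes Tier 2 from the rejected Tier 1 approach.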
Arizona has a mechanism for tagging the source of data, which provides traceability for LEA/SEA collaboration. Several people expressed interest in seeing that mechanism moved into the Ed-Fi Core; the development team would need to better understand the use cases before considering this. Another approach to traceability may be to expand the existing logging in the API.
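Both traceability approaches mentioned above can be illustrated together: tag each inbound record with its source system, and write that tag to the API's log. This is a sketch under assumed names, not Arizona's mechanism; the `_sourceSystem` field, logger name, and record shape are all hypothetical.

```python
import logging

# Route traceability entries through a dedicated logger (name is illustrative).
logging.basicConfig(level=logging.INFO, format="%(name)s %(message)s")
log = logging.getLogger("ods.traceability")

def ingest(record: dict, source_system: str) -> dict:
    """Tag a record with the system it came from and log the provenance.

    LEA/SEA staff could later trace any value back to the submitting system,
    either from the stored tag or from the log stream.
    """
    tagged = {**record, "_sourceSystem": source_system}
    log.info("resource=%s source=%s", record.get("resource"), source_system)
    return tagged

r = ingest({"resource": "studentSchoolAssociations", "studentUniqueId": "604822"},
           source_system="SIS-VendorA")
print(r["_sourceSystem"])  # SIS-VendorA
```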