Currently, the Mac audio driver for Pioneer’s Rekordbox fails to install on Big Sur. This is because Big Sur has upgraded to perl 5.28, which restricts the way it reads files (see the errors below). The good news is that your system should also include perl 5.18. However, it’s difficult to change versions without downgrading the system security around /usr. So instead of changing defaults, I’ve chosen to instead

The File Station UI will not let you create a softlink; it only lets you create “shortcuts,” which live under Favorites or on the Desktop. However, since Synology DSM is essentially a Linux-based system, it is simple to log into the box over SSH and create a symlink from the command line. This will easily create a softlink. However, softlinks don’t show up in the File Station UI, so not very
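The symlink command itself is elided in the excerpt above; the standard tool for this is `ln -s`. The sketch below demonstrates it with temporary paths, since the real Synology volume paths (e.g. something under /volume1) depend on your setup and are assumptions here.

```shell
# Demonstration with temporary paths; on Synology you would point at real
# volume paths instead (e.g. /volume1/music -- hypothetical).
target=$(mktemp -d)                 # stands in for the directory to link to
ln -s "$target" "${target}-link"    # create the softlink
readlink "${target}-link"           # prints the link's target path
```

`ln -s TARGET LINK_NAME` creates the link; `readlink` confirms where it points.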

Use Case: I would like to monitor events from a Synology NFS mount on another system. Solution: a Python module that monitors filesystem events and forwards them to an AWS SQS queue, which my external system can monitor and act upon. Install Python 3 on Synology: visit the Synology Package Center, search for “python,” and install it. Then SSH into your Synology device and verify that python3 is available,
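Once the package is installed, the verification step can be as simple as the following (binary names and paths on DSM may vary; treat these as assumptions):

```shell
# Confirm the interpreter is on PATH and report its version.
python3 --version
# pip may need a separate bootstrap on DSM, so guard that check.
python3 -m pip --version || echo "pip not bootstrapped yet"
```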

Most development projects rely on protected external resources, such as databases or REST services. Often, for the sake of simplified testing, we add those credentials to our configuration files; if they accidentally leak to the wrong person, this can become a painful and expensive issue. In this article, I will demonstrate how to avoid these issues in a Java/Maven development environment. Password leaks can be avoided by simply
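One standard Maven mechanism for this is to keep the secret out of the pom entirely and inject it from the developer’s local ~/.m2/settings.xml, which is never committed. A minimal sketch, where the property name `db.password` is hypothetical:

```xml
<!-- ~/.m2/settings.xml — lives outside the repository -->
<settings>
  <profiles>
    <profile>
      <id>local-credentials</id>
      <properties>
        <!-- hypothetical property; the pom would reference it as ${db.password} -->
        <db.password>changeme</db.password>
      </properties>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>local-credentials</activeProfile>
  </activeProfiles>
</settings>
```

The committed pom (or a filtered resource file) then carries only the `${db.password}` placeholder, never the value.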

A 5-minute POC using Google Sheets in Zeppelin, testing how easy it would be to drop data into a spreadsheet and then analyze it with Zeppelin. First, go to https://docs.google.com and create a new spreadsheet, which will be used for collecting your data. Alternatively, my copy of bank.csv, which is used in this example, can be copied for your own use. Next, you will want to create a shareable link for this spreadsheet,
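For reference, a shared Google Sheet can be fetched as CSV by appending `export?format=csv` to its document URL; the sheet ID below is a placeholder, not a real sheet:

```shell
# Hypothetical sheet ID -- substitute the ID from your own shareable link.
SHEET_ID="1AbC_example_sheet_id"
URL="https://docs.google.com/spreadsheets/d/${SHEET_ID}/export?format=csv"
echo "$URL"
# In a Zeppelin %sh paragraph you could then: curl -L "$URL" -o bank.csv
```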

TL;DR: Using Keycloak as an IDM or LDAP domain aggregator. Download the APS Identity Sync Extension: https://github.com/alex4u2nv/aps-ais-authority-sync/releases/download/v1.0.0/aps-identity-sync-java-1.0.0-jar-with-dependencies.jar. Configure APS to integrate with Keycloak as in the example activiti-identity-service.properties. Configure Keycloak to integrate with multiple LDAP domains via the User Federation service. Authenticate into APS using users that were synchronized: if Keycloak authentication is enabled, authenticate through Keycloak; if other authentication methods are bound to the same user IDs (email addresses), then use

Quick Steps Walkthrough. This walkthrough is targeted at audiences who are new to Vault, or DevOps engineers who just need an API to develop auto-deployment scripts against. A production environment should be installed and operated by a HashiCorp Vault expert. Pull and Run: pull the Docker image and run it in the foreground with port 8200 exposed, using the following command: docker pull vault docker run --cap-add=IPC_LOCK -p
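The run command is cut off above; a likely completion, mapping host port 8200 to the container, is sketched below. The `8200:8200` mapping is an assumption based on the “port 8200” note, and this is the dev-mode server, which keeps data in memory — not a production setup.

```shell
# Build the likely full invocation as a string so it can be inspected
# before actually running it against a Docker daemon.
RUN_CMD='docker run --cap-add=IPC_LOCK -p 8200:8200 vault'
echo "docker pull vault && $RUN_CMD"
```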

Extending ACS 6 Docker images. After you have your ACS 6 local environment running, you’re probably thinking: this is nice, but I want to deploy my favorite AMPs, like JS Console or Alfresco Governance Services (AGS), or even custom AMPs developed with the Alfresco SDK.
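A common way to do this is a derived image that copies the AMPs in and applies them with Alfresco’s Module Management Tool. The image tag, AMP filename, and MMT path below are assumptions to check against the image you actually pull:

```dockerfile
# Sketch of a derived image; tag, filenames, and paths are assumptions.
FROM alfresco/alfresco-content-repository-community:6.0.7-ga
COPY js-console.amp /usr/local/tomcat/amps/
RUN java -jar /usr/local/tomcat/alfresco-mmt/alfresco-mmt.jar install \
      /usr/local/tomcat/amps/js-console.amp \
      /usr/local/tomcat/webapps/alfresco -nobackup -force
```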

Get ACS 6 EA on your local environment and start exploring new features, functionalities, and services.

By default, Alfresco Content Services sets a search limit, based on ACL checks, of 1,000 items. In order to retrieve more than 1,000 items, you will need to do one of two things. However, before you make these changes, you should consider the use case behind your search requirement, as a global change will allow users to run some very long accidental wildcard queries. If search is to
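The relevant settings typically live in alfresco-global.properties; the property names below match what Alfresco’s documentation has used for the permission-check limits, but verify them against your ACS version before relying on them:

```properties
# alfresco-global.properties — raise the ACL-checked result ceiling with care.
# Limit on the number of permission checks performed per query:
query.maxPermissionChecks=1000
# Time-boxed alternative: stop checking after this many milliseconds:
query.maxPermissionCheckTimeMillis=10000
```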

Loop through a result set and execute an action on the objects. This example uses

Audit replication from Alfresco Content Services (ACS) to Elasticsearch using Spring Boot and Apache Camel. This project uses a pull/push integration model, in which the ACS audit stream is pulled from the REST API and pushed to Elasticsearch. Once the audit data is in Elasticsearch, the Kibana UI can plug in to generate dashboards and charts based on audit actions inside Alfresco Content Services.