Saturday, August 6, 2022

7th of August

Marks Rpi Cluster continues to run 24/7. The three newer Pi4's have built their einstein@home Recent Average Credit up to 1000, and some of the existing Pi4's are up to 1300 credits. It takes about a month for the credit to stabilise, so that's pretty good for two weeks of running in the EC12.


Validation failure rate
I looked at the validation failures I was getting. A validation failure is where I compute a work unit and my result doesn't agree with the results from other computers that have done the same work unit.

I've been running an ARM64-optimized version of the BRP4 app written by a volunteer with the handle N30DG. For comparison I ran the project-supplied 1.61 app for a few days on a single Pi4. According to einstein@home's server status page, the average validation failure rate across all computers is around 10 to 11%, varying a bit from day to day. My validation failures are around 5%, so better than most.

The optimized BRP4 app takes slightly over 3 hours per work unit. The project-supplied 1.61 app takes around 4 and a half hours.
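Put in throughput terms, a rough sketch (using approximately 185 and 270 minutes per work unit, taken from the runtimes above):

```shell
# Rough throughput comparison; the minute figures are approximations
# of the runtimes mentioned above, not exact measurements.
awk 'BEGIN {
  opt = 185    # optimized BRP4 app, minutes per work unit (~3h05m)
  stock = 270  # project-supplied 1.61 app, minutes per work unit (~4.5h)
  printf "optimized app completes %.2fx as many work units\n", stock / opt
}'
```

So each Pi4 gets through roughly half as many work units again per day on the optimized app.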


Another EC12
I ordered another Edge Cluster 12. BitScope had stock of the Pi4 8GB at the same price as my usual suppliers so I ordered 12 to go in it. I should get them next week. Hopefully the Pi4's will be the 1.8GHz version.

I received another couple of Meanwell power supplies that I ordered online from an electrical distributor. The Meanwell power supplies are designed to drive LED lights, but I happen to use them to power the EC12 and before that the Blade Rack 20.

I need more 32GB MicroSD cards. I have a few spares as they tend to wear out, but I don't have 12 of them. I should be fine for network cables and switches.

Sunday, July 31, 2022

31st of July

Marks Rpi Cluster is running 24/7. After a break to migrate the Pi4's from their cases to the Edge Cluster 12, they are now running flat-out doing Einstein BRP4 work. Einstein had the BRP4 work turned off at the beginning of the week, so I ended up running their FGRP5 work, which was taking 25 hours per work unit. The BRP4's are much more digestible at 3 hours and 5 minutes.

BitScope do make larger versions than the EC12, but that was all the Pi4's I had on hand and it's a reasonable size to deal with. Most of the Pi's are running between 53 and 59 degrees C, so airflow doesn't seem to be an issue. That brings Marks Rpi Cluster to the following configuration:

12 x Pi4 8GB as compute nodes (in the EC12).
3 x Pi4 2GB as support nodes.
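Those temperatures come from the SoC sensor, which Raspberry Pi OS exposes through the firmware tool vcgencmd. A small sketch of how I could flag a hot node; the check_temp helper and the 65 degree threshold are my own choices, not anything from BitScope:

```shell
#!/bin/sh
# Flag a Pi whose SoC is running hot. check_temp and the 65 C
# threshold are hypothetical choices for illustration.
check_temp() {
  # $1 is a vcgencmd-style reading, e.g. temp=55.4'C
  t=$(printf '%s' "$1" | sed "s/temp=//; s/'C//")
  awk -v t="$t" 'BEGIN { print ((t > 65) ? "WARN" : "OK") }'
}

# On a Pi you would feed it the live reading:
#   check_temp "$(vcgencmd measure_temp)"
check_temp "temp=55.4'C"   # prints OK for the 53-59 C range seen here
```

Run over ssh against each node in turn, it gives a quick health check of the whole rack.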

I have ordered another couple of Meanwell 24v power supplies. I am thinking I might get another EC12, assuming I can find some Pi4's to put in it. The Pi4's are out of stock everywhere at the moment.


Tuesday, July 26, 2022

BitScope Edge Cluster 12

 Introducing the BitScope Edge Cluster 12. 

 

Those are the parts you get. You just need to add your own Raspberry Pi's and a suitable power supply.


These are the two end plates, one ground (black) and the other live (red). The green plug is the power plug.


These 3 plates are what you plug the Rpi's into; BitScope refer to them as Cluster Plates. Each one has power regulation circuitry on the end to ensure reliable power. The Pi's push onto a GPIO socket on the other side. There are 4 nylon stand-offs and nylon screws to secure each Pi, although the GPIO pins are tight enough on their own. There is a fan for each individual Pi.

 

This is the back of it with the two 80mm fans and grills.

And this is the front where all the connections go. The thick black cable on the left is from my power supply (a Meanwell 24v 10a output unit) and plugs into the live plate. There is a plastic cover to go on the front, however I need to drill a hole in it so I can feed the power plug into the unit, which I consider a design flaw. The power connector butts up against the rear plastic panel.

Another design quirk: if you need to replace the SD cards, you have to remove the back cover with the two 80mm fans.

One problem I encountered while assembling it was missing nylon stand-offs on one of the cluster plates. I ended up taking a stand-off from my old Pi cluster.

The top metal plate is supposed to have 4 threaded screw holes, however mine had 4 holes and only 3 were threaded. The 4th one appears to have been drilled out, as it's larger, so I could only use 3 screws to mount it.

I now have all 12 Pi4 8GB in here and running. The fans keep spinning up and down which is a little noisy but not too bad.


Update 7th of August 2022

I spoke with BitScope and they offered to replace the top plate and the missing nylon stand-offs, as they are considered manufacturing defects.

The design flaw I mentioned is, according to BitScope, due to my assembling it the wrong way. The live plate is supposed to be positioned so the power connector sits flush when the front panel is fitted. The front is where the network and USB ports face.

As for the fans, the fan curve can be updated, or they can simply be set to a fixed speed. The ramp up and down seems to be unique to the Einstein FGRP5 work units; the BRP4 work doesn't have the same issue, so it's workload related.

 

Saturday, July 9, 2022

9th of July

Marks Rpi Cluster continues running 24/7, well at least the Pi4's are. The Pi3's get to run on weekends. The cluster is concentrating on the Einstein@home BRP4 work.

I mentioned in my last post that I had ordered a BitScope Blade for the Pi4's. It was supposed to be ready in a couple of weeks, but it's been almost a month, so I will have to chase them up.

The Raspberry Pi Foundation seem to be pushing the libcamera-apps-lite package in their latest updates. I am running the Lite version of Raspberry Pi OS, which doesn't have a desktop, and the package is going to install desktop programs that are useless on a headless Pi. To top it off, I don't have a Pi camera. I had to hold the package with a sudo apt-mark hold libcamera-apps-lite command to prevent it from installing.
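For anyone wanting to do the same, the hold can be confirmed and later undone with apt-mark's other standard subcommands:

```shell
# Pin the package so apt upgrades skip it on the headless nodes
sudo apt-mark hold libcamera-apps-lite

# List everything currently held, to confirm
apt-mark showhold

# If a camera ever gets attached, release the hold again:
# sudo apt-mark unhold libcamera-apps-lite
```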


Update: 10th of July

Debian have done a point release so there are a number of updates to apply to the cluster.