Re: Kegakigeji ケガキゲージ
Posted: Fri Jul 31, 2015 7:45 am
Mathieu,
Aw, I was just joking around with the self-deprecation in my previous post. I wasn't actually feeling weak at the prospect of buying the Mitsui gage, though maybe I should hold off on that assessment until I see what's left of my bank balance.
Good to hear your positive recommendation for the tool, as I just placed an order. I'd seen these tools a while back but had never paid them much attention. I realize now how one of these kegaki gages could see frequent use in my shop, as I often use the end of the combo square blade as a place to lay out lines along the grain of a timber, and occasionally cross-wise to it as well. Having a long and gap-free registration surface for the pencil will be decidedly advantageous. The groove along one side of the combo square - a place my pencil often wants to fall into, and I'd rather it didn't - is something I have long found to be a bit of a fly in the ointment.
It's still handy to use the end of the combo square blade when it comes to laying out narrower surfaces like tenons, or in certain other confined spaces, where the width of the Mitsui tool might make it less suitable. Hmm, I wonder if they make a narrower version? I'll try it first and see before thinking about that further.
Inch scale has certain advantages, just like every measuring system has pluses and minuses.
If you'll forgive me, I'd like to expand on that a bit...
The usual argument one might hear is that metric is more 'rational' by virtue of being decimalized.
I'm not so sure about that argument.
First off, consider that inches can be divided into tenths and twentieths, as they are on the sashigane I designed, giving a resolution as fine as 1/20 of an inch (0.05"). You can clearly see the 1/20" increment marks, and even interpolate between them, so one could say the tool gives a visual resolution down to 1/40" (0.025").
Regular inch-scale measuring tools - tape measures, rulers, framing squares - give a resolution according to their usual divisions: 1/16" (0.0625"), 1/32" (0.03125"), and so on. Measuring tapes typically stop at 1/32" resolution. On a ruler with etched divisions, we can see 1/64" (0.015625") ticks - if you have reasonable eyesight - and perhaps even 1/100" if your eyes are really sharp. Using a scratch awl to transfer marks against the etched line obtains the maximum level of physical marking-out accuracy with a ruler.
Let's consider the metric-scale tool equivalents. In woodworking you would work at naked-eye resolution, and the measuring tools would be marked in 1/10cm (1mm) divisions, correct?
1/10cm, 1 millimeter, is equal to 0.03937". The next division possible would be 1/100cm, a tenth of a millimeter (0.003937"), but that is too fine a division to be seen by the naked eye, let alone marked on the tool's surface. Not very useful as a measure for woodworking at all, unless you like to use a loupe with your ruler. Any 'seeing' at that resolution is more likely to be done with the aid of a caliper with 1/10 of a millimeter resolution or better. If you're not using such calipers in your woodworking, your limit of measuring accuracy is set by a tool with millimeter markings.
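Since I'm about to sling a lot of conversion figures around, here's a quick sanity check of the metric-to-inch arithmetic - just a sketch in Python, using the exact definition 1 in = 25.4 mm:

```python
# Convert the metric divisions discussed above to inches.
# By definition, 1 inch = 25.4 mm exactly.
MM_PER_INCH = 25.4

for mm in (1.0, 0.1):
    print(f"{mm} mm = {mm / MM_PER_INCH:.6f} in")

# 1.0 mm = 0.039370 in
# 0.1 mm = 0.003937 in
```

Nothing fancy, but it confirms the two figures above: a millimeter is 0.03937", and a tenth of a millimeter is 0.003937".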
That's a significant issue, I think. Woodworking, throughout most of its history, has relied upon naked eyesight when it comes to setting measures. Rules and framing squares and tapes display fractional increments - ticks along the edge we inspect when putting a number to something.
How fine are those divisions and what level of relative accuracy do they provide?
With a metric rule or square, the 1 millimeter increment on the tool provides 0.03937" of resolution.
With an inch-scale tool, the 1/32" and 1/64" divisions would be the ones used. Those divisions provide 0.03125" and 0.015625" of resolution respectively. Both are finer than 1mm divisions.
One could argue that it is possible to visually interpolate the spaces between the 1mm tick marks on the tool, to a 0.5mm refinement. That equals 0.019685", about two hundredths of an inch, which is still not as good as the 1/64" (0.015625") accuracy you obtain with the usual inch-scale rule, framing square, etc., read with the naked eye.
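To put numbers on that naked-eye comparison, here's a small Python check - my own figures, and I'm assuming you can interpolate halfway between the millimeter ticks:

```python
MM_PER_INCH = 25.4  # 1 inch = 25.4 mm by definition

# Finest naked-eye marks on each scale, expressed in inches
half_mm = 0.5 / MM_PER_INCH   # interpolating between 1 mm ticks
tick_64 = 1 / 64              # 1/64" tick on an etched inch rule

print(f"0.5 mm interpolation: {half_mm:.6f} in")
print(f"1/64 in tick:         {tick_64:.6f} in")
print("inch scale is finer:", tick_64 < half_mm)

# 0.5 mm interpolation: 0.019685 in
# 1/64 in tick:         0.015625 in
# inch scale is finer: True
```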
And then there is what happens when you get into finer measurements yet. Let's say you obtain a metric caliper which measures to the next level of accuracy up from what is possible with tape or ruler markings seen with your naked eye - a display with a resolution of 0.1mm. Or you go even further, move one more step up in accuracy, and get a caliper with a display measuring to 0.01mm.
Now, 1/10mm, 0.1mm, resolution equals 0.003937", about 4-thousandths of an inch, while the next step up in accuracy, 0.01mm, equals 0.0003937".
The former value (0.003937") would be useful in machinist's work, perhaps, though one could see it being useful for woodworking also. It provides roughly 0.004" of resolution, which is decent for sure. I think one can work to a higher level of resolution than that with powered or hand tools, if one chooses and/or is able.
Now, let's say you get a digital caliper in inch scale - how would that, uh, measure up? A caliper measuring to 1/10" would be pointless, as you can already make out 1/64", or even 1/100", on the rule with the naked eye. We would therefore be looking for a tool giving better than 1/100" accuracy.
The next step up - the natural place for an inch-scale caliper - would be a tool providing 1/1000" (0.001") accuracy. The effective resolution on the tool is 0.001"; the readout on my Digimatic caliper has another place past the decimal, but it is not actually accurate to 0.0001" - it is 'only' accurate to +/- 0.0005". Plenty accurate for anything I might conceivably do in the wood shop or machine shop. I would argue that a tenon fit is noticeably different given a dimensional difference of plus or minus 0.002" in hard woods. In other words, there are physically detectable differences in the joinery at those levels of resolution. Whether that level of difference matters to you, or is something you care to attain, is another matter. How well fitted would you like a joint to be? Somewhere between sloppy and too tight, I guess....
Here we see the inch-scale tool giving a greater level of resolution than is attainable with the corresponding metric-scale tool. The metric tool has 0.1mm (0.003937") accuracy. The inch-scale tool you would use in its place measures to 0.001" reliably, and that is about 4 times more accurate than the metric tool. It is inherently more accurate simply by virtue of what happens when we divide the respective units up, and of the usable tools we have at each step up in accuracy.
Now, let's say you have a metric caliper which can measure with 0.01mm accuracy, a division equal to 0.0003937". That is the place where we move from metalworking and super-precision woodworking to the realm of accuracy required for telescope building, gyroscopes, rockets, atomic physics, and the like. Only specialized machine shops in 'laboratory clean' conditions would be working to that level of accuracy.
For comparison, what's the next level up in accuracy for an inch-scale caliper?
Well, that would be a caliper measuring to 0.00005", a resolution roughly 8 times finer than the metric equivalent tool with 0.01mm (0.0003937") resolution.
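For anyone who wants to check my arithmetic on these caliper comparisons, here's the same calculation in Python - taking the metric caliper steps whose inch equivalents I quoted above (0.003937" and 0.0003937"), and dividing each by the corresponding inch-scale caliper step:

```python
MM_PER_INCH = 25.4  # 1 inch = 25.4 mm by definition

# (metric caliper resolution in mm, inch caliper resolution in inches)
steps = [(0.1, 0.001), (0.01, 0.00005)]

for mm_res, inch_res in steps:
    ratio = (mm_res / MM_PER_INCH) / inch_res
    print(f"{mm_res} mm step vs {inch_res} in step: "
          f"metric is {ratio:.1f}x coarser")

# 0.1 mm step vs 0.001 in step: metric is 3.9x coarser
# 0.01 mm step vs 5e-05 in step: metric is 7.9x coarser
```

So the inch-scale caliper comes out roughly 4 times finer at the first step and roughly 8 times finer at the second.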
At every point of comparison, the inch itself has the advantage of being a larger physical unit than the centimeter. For old-school work in a wood shop, where we wouldn't be using digital calipers and would rely instead upon divisions inked or notched into the edge of the tool, the inch scale provides divisions which allow for more accurate work than the metric-scale equivalent. When you look at the next levels of accuracy obtainable with either system, using calipers of some sort, the inch-scale tool wins out in both useful and absolute accuracy at each jump in resolution.
Everyone will agree that working in a decimal-based manner, as you do in metric, makes a lot of sense.
You can work readily in decimal inches if you like with conventional layout tools though.
The more accurately you seek to work, the more you will work in decimals anyhow, and the less other fractions are involved. When you want to get accurate, in either system, you start working in decimals.
It's very natural, though, to take a given measurement and, in the effort to mark finer and finer units upon it, divide the initial distance in half, then divide each part in half again, and so on. That's one point in favor of divisions like 1/2, 1/4, 1/8, 1/16, 1/32, and so on. They are natural to us. Such fractions are an obvious and natural way of refining measurements, particularly for people using these systems who lacked numeracy.
If you were working at the most primitive level, which task would you rather have: divide a distance in half, then half again, then half again..., or divide it into 10 parts? The former task - divide, divide, divide, and so on - is a natural one to undertake, while the latter, dividing a measure into 10 parts, requires some tricks to achieve. Picking 1/10 divisions is not the first solution you would come to, in other words.
If we consider the tools we use by squinting at a mark along an edge, the most accurate we can be is decided by what we can discern by the naked eye.
If you stick with the standard divisions those tools have, comparing metric-scale and inch-scale at each step along the road in accuracy, the inch-scale tools would appear to me to have advantages in useful accuracy at each stop along the line. These relative advantages leave the 'decimalization advantage' of metric as somewhat of a moot point. Any highly accurate inch-scale measuring we undertake is going to be in decimalized values, so the decimal advantage is only arguable for the regular, naked-eye-visible measures we handle. The inch-scale tools are more natural in how the measures are divided, are inherently more accurate in the refinement those divisions provide, and, with a variety of divisions available to use, are also more versatile. Metric provides only one division, by 10 each time, and this works out to be a disadvantage, I would argue, in a lot of cases.
Wrote more than I initially intended - excuse the long-windedness.