Syntax

@Generalize([HANDLE_ARCS_ELLIPSES_AND_TEXT,]Deveau, ,,)
@Generalize([HANDLE_ARCS_ELLIPSES_AND_TEXT,]Douglas,)
@Generalize([HANDLE_ARCS_ELLIPSES_AND_TEXT,]Thin,)
@Generalize([HANDLE_ARCS_ELLIPSES_AND_TEXT,]ThinNoPoint,)
@Generalize([HANDLE_ARCS_ELLIPSES_AND_TEXT,]McMaster, ,)
@Generalize([HANDLE_ARCS_ELLIPSES_AND_TEXT,]McMasterWeightedDistance, ,,)
@Generalize([HANDLE_ARCS_ELLIPSES_AND_TEXT,]Curvefit, ,,,,)
@Generalize([HANDLE_ARCS_ELLIPSES_AND_TEXT,]NURBfit, [,])
@Generalize([HANDLE_ARCS_ELLIPSES_AND_TEXT,]Inflection,)

Arguments

The tolerance used for point thinning or generalization. For the Douglas algorithm, points less than the tolerance from the surrounding line segment are removed; as a result, all points of the original segment will lie within a band of the tolerance width centered on the resulting line. For the Deveau algorithm, points are kept only when no band of the tolerance width can be found that contains both the original points and the resulting line. The band is allowed to float, resulting in a smoother result. For the Thin and ThinNoPoint algorithms, points are removed such that the distance between adjacent vertices is guaranteed to be larger than the tolerance. When a value of 0 is given for the tolerance, only redundant nodes are removed. (A node is "redundant" if removing it does not change the shape of the line in any way.)
Range: >= 0

For the Deveau algorithm, this parameter controls the number of simultaneous wedges considered when forming floating bands around the points in the set. When this parameter is 1, the Deveau algorithm functions in the same way as the Douglas algorithm. The larger this value is, the more aggressive the generalization and the smoother the resulting line.
Range: 1..30

This parameter sets the sharpness tolerance for spikes that will be blunted. Vertex points at angles less than this value from the previous two points are not moved. The angle is measured in degrees. A value of 110 is recommended.
Range: Between 0 and 180

For the McMaster and McMasterWeightedDistance algorithms, each point is smoothed by averaging its x and y coordinates with the x and y coordinates of neighboring points. This parameter determines how many neighbors to the left and right of each point are considered.
Range: >= 1

For the McMaster and McMasterWeightedDistance algorithms, after a point has been given a new location through averaging, it is then moved back towards its original location by the amount specified by this parameter.
Note: This parameter is a percentage, so a value of 50 will place the point exactly midway between its averaged location and its original location.
Range: Between 0 and 100

For the McMasterWeightedDistance algorithm, this parameter is used as the power in an inverse distance weighting formula. A value of zero gives each point in the averaging calculations equal weight. A value from 0 to 3 is recommended.
Range: >= 0

For the Curvefit algorithm, this sets the maximum deviation allowed at any point between the original polyline and the resulting polyline.
Range: > 0

For the Curvefit algorithm, this allows very flat curves to be represented by straight segments. Any curve whose mid-ordinate is less than this amount will be replaced by a straight segment. A typical value is 10% of the precision setting.
Range: >= 0

For the Curvefit algorithm, this determines the importance given to Compression relative to Smoothness and Accuracy. Compression is the reduction in the number of vertices.
Range: > 0, <= 10

For the Curvefit algorithm, this determines the importance given to Smoothness relative to Compression and Accuracy. Smoothness is the tangency of consecutive segments, that is, how close the end angle of a segment is to the start angle of the next segment.
Range: > 0, <= 10

For the Curvefit algorithm, this determines the importance given to Accuracy relative to Compression and Smoothness.
Accuracy is how closely the resulting curve overlays the original.
Range: > 0, <= 10

For the NURBfit algorithm, this determines the degree of the basis polynomials used to approximate the input curve. The degree must be greater than one and less than the number of vertices in the line minus two.
Range: >= 2

For the NURBfit algorithm, this determines the segment lengths of the output curve. If this is set to 0, then the output curve will have 10 times as many points as the input curve.
Range: >= 0

For the Inflection algorithm, this determines the number of points used to smooth the curve for the purpose of inflection detection. The additional points are weighted with Gaussian weights.
Range: >= 0

Description

The @Generalize function modifies a feature's geometry by removing its points or by calculating new positions for its points.

Note: There are two issues to consider before performing generalization. The first is that the feature should have no self-intersections. The second is that the generalization value should be less than the width of any narrow corridors within the feature. For performance reasons, the @Generalize function does not verify either of these conditions. If you think your data might have self-intersections, first run it through the IntersectionFactory and then perform @Generalize on the resulting pieces. To detect and fix any narrow corridors that are less than the generalization value, you can use IntersectionFactory after @Generalize is performed.

Two algorithms are available for point thinning or generalizing: Douglas and Deveau.

Douglas Algorithm

The Douglas algorithm takes only a tolerance parameter to control the amount of point thinning performed. It removes points from the original line, but does not adjust the location of the remaining points. The Douglas algorithm is described in the following publication: David H. Douglas and Thomas K. Peucker, "Algorithms for the Reduction of the Number of Points Required to Represent a Digitized Line or Its Caricature," Canadian Cartographer, Vol. 10, No. 2, December 1973, pp. 112-122.

Deveau Algorithm

The Deveau algorithm both removes points and adjusts the locations of the remaining points, resulting in a smoother generalization. In addition to the tolerance, the Deveau algorithm takes two further parameters, the wedge count and the sharpness tolerance, to control the generalization. These parameters must be tuned for particular applications to produce aesthetically pleasing results. The Deveau algorithm is fully described in the paper "Reducing the Number of Points in a Plane Curve Representation," by Terry J. Deveau, in the AutoCarto VII proceedings.

Note: The generalized output feature's geometry will always be 2D, because of the way the algorithm works. During generalization, existing vertices may be moved and/or new vertices introduced, which invalidates the z coordinates of the vertices.

Thin Algorithm

The Thin algorithm is a simple vertex thinning algorithm that only removes points; it does not adjust the locations of the remaining points. Points are removed such that the distance between adjacent vertices is guaranteed to be larger than the tolerance.

ThinNoPoint Algorithm

The ThinNoPoint algorithm is the same as the Thin algorithm, except that the beginning and end points of lines are never removed. If the entire length of the feature being thinned is less than the tolerance, the feature is replaced by a linear feature connecting the first point to the last point.

McMaster Algorithm

The McMaster algorithm smooths a feature by determining a new location for each of its points. The new location of a point is found by taking the average of its x and y coordinates and the x and y coordinates of neighboring points; the point is then slid back towards its original location. The overall effect is that each point is pulled towards its neighbors.
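The neighbor-averaging and slide-back steps described above can be sketched as follows. This is a minimal illustrative sketch, not FME's implementation; the function name and parameter names (`k` for the neighbor count, `displacement_percent` for the slide-back percentage) are placeholders, not FME identifiers.

```python
# Sketch of McMaster-style smoothing: average each interior vertex with its
# k neighbors on each side, then slide it back toward its original position
# by displacement_percent (100 restores the original, 0 keeps the average).

def mcmaster_smooth(points, k=1, displacement_percent=50):
    """points: list of (x, y) tuples; returns a new, smoothed list."""
    n = len(points)
    out = list(points)          # endpoints (and any point without k neighbors) are untouched
    t = displacement_percent / 100.0
    for i in range(k, n - k):
        window = points[i - k:i + k + 1]
        ax = sum(p[0] for p in window) / len(window)   # averaged x
        ay = sum(p[1] for p in window) / len(window)   # averaged y
        ox, oy = points[i]
        # Slide the averaged point back toward its original location.
        out[i] = (ax + (ox - ax) * t, ay + (oy - ay) * t)
    return out
```

With `displacement_percent=50`, each smoothed point lands exactly midway between its averaged and original locations, matching the percentage semantics described in the Arguments section.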
McMasterWeightedDistance Algorithm

The McMasterWeightedDistance algorithm works the same as the McMaster algorithm, except that it uses an inverse distance weighting formula to take into account the distance from a point to its neighboring points. Points that are close together have more pull on each other than points that are further apart.

Curvefit Algorithm

The Curvefit algorithm works by removing vertices and replacing the line segments between them with arcs. The process can be customized to emphasize vertex reduction, absolute fit, or tangency (smoothness) between segments.

Tip: The @Generalize function logs statistics about the number of points it removed using each algorithm.

Note: As the values of the weight settings increase, the number of potential solutions considered increases in steps. If any weight setting is 2 or greater, an increased number of potential solutions is computed; if any weight setting is 5 or greater, the maximum number of potential solutions is calculated. This increases the probability of finding the optimum solution, but also increases computation time.

NURBfit Algorithm

The NURBfit algorithm approximates the input curve with uniform non-rational B-splines. If the curve is open, the output curve will have the same start and end points as the input curve. If the curve is closed, the start and end points have no bearing on the calculations.

Inflection Algorithm

The Inflection algorithm converts an input curve into a set of points representing the curve's inflection points.
If the input curve is noisy and the algorithm produces too many spurious inflection points, the numNeighbors parameter can be increased to effectively smooth the line for the purpose of the inflection point calculations.

TO BE RESOLVED

The HANDLE_ARCS_ELLIPSES_AND_TEXT argument was added to the Syntax section above, but is not documented. This argument is deprecated, but must be kept for legacy purposes.
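For reference, the recursive point-removal strategy of the Douglas algorithm described in the Description section can be sketched as follows. This is an illustrative sketch of the classic Douglas-Peucker approach, not FME's implementation; function names are placeholders.

```python
# Sketch of Douglas(-Peucker) point thinning: points closer than the
# tolerance to the segment joining the endpoints are dropped; the farthest
# point beyond the tolerance splits the line, and both halves recurse.
import math

def _point_seg_dist(p, a, b):
    """Distance from point p to the line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamping to its ends.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def douglas(points, tolerance):
    """points: list of (x, y) tuples; returns the thinned list."""
    if len(points) < 3:
        return list(points)
    # Find the vertex farthest from the segment joining the endpoints.
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _point_seg_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax < tolerance:
        # Every interior point is within the tolerance band: drop them all.
        return [points[0], points[-1]]
    # Otherwise split at the farthest point and thin each half.
    left = douglas(points[:idx + 1], tolerance)
    right = douglas(points[idx:], tolerance)
    return left[:-1] + right
```

Note that, as the Description warns, this style of thinning removes vertices but never relocates them; smoothing the survivors is what distinguishes Deveau from Douglas.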